I recently saw David Fincher's movie The Social Network and enjoyed it immensely. The question I feel the movie didn't (and perhaps couldn't) answer is the one in the title of this post.
According to the movie, the evidence for:
He turned down $2m from Microsoft for some music play-list software
He was at Harvard in the first place
He wrote Facemash in one night - when drunk!
He's the CEO of one of the most influential start-ups in Silicon Valley history
The evidence against:
He allegedly stole the idea for Facebook
The movie seems to suggest that because he was socially inept and a good hacker, he must be a genius. Is that enough? Whatever the truth of the matter, the fact remains that he is at the helm of one of the hottest companies in the world whilst still in his twenties. If not a genius, then that is evidence of some exceptional talents. Genius or not? I'd say it doesn't really matter after the first billion.
Monday, November 22, 2010
Why do people hate Skyline?
I saw Skyline a couple of weeks ago and thoroughly enjoyed it. I was quite surprised to see the scathing reviews from critics and moviegoers alike.
OK, the script isn't great, but it wasn't that bad either. The cast and characters were OK too, but I found the story, direction and effects all above average. IMDB rated this movie at 4.7 (based upon 4,000 ratings) when I last checked, but I'd score it somewhere in the 7s, and I'm a pretty harsh movie critic.
So why the discrepancy? Well, maybe I had an off day and just enjoyed a crap movie too much, but the doubt at the back of my mind suggests there is something more sinister at work here.
It turns out that the Brothers Strause's (silly name, I know) effects company Hydraulx was contracted to work on Sony's big-budget alien invasion movie Battle: Los Angeles, due for release in early 2011, and that the brothers didn't declare their involvement in Skyline to Sony. From what I gather, Sony wanted Skyline pulled until after the release of their film. All in all it sounds a bit hypocritical of Sony to me, given the acknowledged dirty tricks they've employed in the past:
http://news.bbc.co.uk/2/hi/4741259.stm
but hey, I wouldn't want to get on their bad side. Whatever the truth of the situation, I figure Skyline is a pretty good movie, and when the budget is factored in it becomes a pretty amazing one. Count me in for Skyline 2. And as for poor old Sony - don't worry - the trailers for Battle: Los Angeles look great too, so I'm in for that movie as well.
Labels:
Battle: Los Angeles,
Brothers Strausse,
Skyline,
Sony
RED One
I don't get the opportunity to go to the cinema as often as I'd like, so it's a minor miracle I've managed to see two movies over the last couple of weekends. One has been savaged by the critics and the other lauded. The tenuous connection between the movies: both were shot digitally using RED One cameras.
The first is a sci-fi flick called 'Skyline', which has been savaged by critics and user reviews alike. The question I'm wondering is: why? OK, the script is a bit hammy here and there, but I thought the cast, story, direction and special effects were all fine, even, dare I say it, good. There's no doubt that the film plagiarizes elements of a whole bunch of other sci-fi movies (War of the Worlds, Cloverfield and Independence Day in particular, and to a lesser degree The Matrix and Predator), but then what alien invasion movie doesn't owe something to H.G. Wells' original story The War of the Worlds? Aside from The Matrix, I enjoyed this film more than the others listed above, and I think it's the movie that Cloverfield should have been. So why the scathing reviews? Maybe that's a question for Sony and the subject of another post.
The other interesting aspect is that this movie allegedly cost $10-15m to make, and that included 1,000 effects shots. That's amazing, as to me it looks like an $80-100m movie. Given that I'm currently working on a $10.5m IT project, I find it remarkable that a film that looks as good as Skyline can be made for the same money.
Best Scene: When the F22-Raptor takes out an alien.
Best Quote: Well there aren't any!
The second movie is 'The Social Network', and certainly nobody is complaining about Aaron Sorkin's wonderful script, David Fincher's direction or the fantastic cast. I found the movie interesting on a number of levels, but mainly in its portrayal of life at a US Ivy League university (Harvard) - they have groupies? - and to a lesser extent the evolution of a tech startup in Palo Alto. What I felt the film didn't really address is a question that's burning in tech circles right now: is Mark Zuckerberg a genius, or was he just lucky? Again, that's a question for another post.
Best Scene: The Henley Rowing Montage
Best Quote:
Gage: Mr. Zuckerberg, do I have your full attention?
Mark Zuckerberg: [stares out the window] No.
Gage: Do you think I deserve it?
Mark Zuckerberg: [looks at the lawyer] What?
Gage: Do you think I deserve your full attention?
Mark Zuckerberg: I had to swear an oath before we began this deposition, and I don't want to perjure myself, so I have a legal obligation to say no.
Gage: Okay - no. You don't think I deserve your attention.
Mark Zuckerberg: I think if your clients want to sit on my shoulders and call themselves tall, they have the right to give it a try - but there's no requirement that I enjoy sitting here listening to people lie. You have part of my attention - you have the minimum amount. The rest of my attention is back at the offices of Facebook, where my colleagues and I are doing things that no one in this room, including and especially your clients, are intellectually or creatively capable of doing.
[pauses]
Mark Zuckerberg: Did I adequately answer your condescending question?
Labels:
Mark Zuckerberg,
Skyline,
The Social Network,
Web 2.0
Tuesday, March 23, 2010
And the award goes to .... The Hurt Locker
OK I know the Oscars are long finished but I just saw 'The Hurt Locker' at the weekend and I've wanted to tie it into a post for ages.
My two cents' worth - I desperately wanted 'The Hurt Locker' to be a worthy winner of six Oscars, especially as it won out over 'Avatar'. However, I was disappointed. Whilst it is a good film, it certainly isn't a great one. Best Film and Best Director? I don't think so. But this just reaffirms what I've always believed about luck and timing playing their part in awards ceremonies.
At my last ever Oracle Consulting conference, I was led to believe by my Practice Manager that the project team I had led was up for an Outstanding Performance Award. We had just finished a difficult four-month project that was a great success. The client was happy and referenceable, the systems integrator couldn't praise us highly enough and was lining us up for more work. And we had achieved all of this with the minimum of fuss. OK, we'd worked a few weekends and late nights, but nothing out of the ordinary.
So there I sat at the dinner on the final night of the conference, fully expecting my team to be one of those to pick up an award. You guessed it, though - we didn't even get a mention. I have to admit I was desperately peed off. The project that claimed the prize we'd been promised had, by contrast, been classic car-crash IT: badly managed, poor quality, late, over budget, all hands to the pumps, client threats, the lot. Some hours later my Practice Manager came skulking over with some lame explanation that the award was given in recognition of all the 'above and beyond' efforts put in by the other project team.
So my advice to you: if you want to win awards and get recognition, go ahead and fcuk up your project, then flog your staff for 18 hours a day to correct your mistakes. Don't, whatever you do, just run a successful project without drama. I left Oracle a few months after that, and the irony was that I received a belated award for the penultimate project I'd been on. Too little, too late.
Thursday, March 18, 2010
Cheap at half the price
Here at the Department of Hopes and Dreams we're having a lovely spat with one of our software vendors at present. The issue is that my boss is questioning the need to pay a quarter of a million dollars in annual maintenance fees when he believes we only use about $40,000 worth of the product.
The interesting thing is that the software supplier has recalculated the maintenance bill twice now, using different breakdown structures. Neither breakdown clarifies exactly what it is that we're paying for. In fact, the only figure that remains the same is the invoice amount: $250,000. Funny, that.
The wider question is: once you've implemented something and it has successfully bedded in, why bother paying maintenance ever again? The software vendor will say that you're out of support and that your original software licence will be revoked, but if you do the cost-benefit analysis you might find that withholding the maintenance over, say, five years would pay for replacement software further down the line. By that time the software might be half the price anyway. It's certainly worth some consideration.
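The back-of-the-envelope sums are easy enough to sketch. The maintenance figure below is the real $250,000 from our invoice; the replacement cost is purely my assumption for illustration:

```python
# Hypothetical break-even sketch: withhold annual maintenance and
# bank it towards replacement software five years down the line.
ANNUAL_MAINTENANCE = 250_000   # actual annual maintenance invoice
YEARS = 5
REPLACEMENT_COST = 800_000     # assumed future licence + implementation cost

saved = ANNUAL_MAINTENANCE * YEARS
surplus = saved - REPLACEMENT_COST

print(f"Maintenance withheld over {YEARS} years: ${saved:,}")
print(f"Left over after buying a replacement:   ${surplus:,}")
```

On those (admittedly invented) numbers you'd bank $1.25m and still have $450,000 in change after re-platforming - and if the replacement really is half the price by then, the case only gets stronger.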
How much should you pay for software?
I've always had a problem paying lots of money for software, which is odd when you think about it. I'll happily pay for hardware, or to see a movie, or to listen to a CD. In fact, the only software I'll pay for without much hesitation is a computer game.
Out of the twenty or so apps I've downloaded onto my iPhone the only paid one is the very successful game Flight Control which cost me the princely sum of $1.19.
So at Oracle I was working with a sales rep who was trying to land a $10m global licence deal with a large Australian multinational. I was surprised, given that this account made up about a third of his territory, that he was chasing a deal that would probably limit what he could sell in future years.
We had a discussion about what software was worth and it was illuminating to me. As he pointed out - to him the software was worth the cost of the CD - a few cents and nothing more.
The funny thing is that a few years earlier I had worked for a mainframe software house which, when the annual results were due and the numbers looked bad, had a dodgy practice of cutting a few tapes, bunging them in a storage cupboard and reporting the software as millions of dollars' worth of assets. (They were later censured by the Stock Exchange for this practice.)
I guess the old marketing saying is true - something is only worth what somebody else is prepared to pay for it.
Big time Brits to whom Apple are indebted
In a previous post I spoke of my amazement at how the UK was the engine of innovation during the Industrial Revolution. Whilst the UK is no longer the powerhouse it used to be, we still probably punch above our weight in certain fields, given that we make up only about one in a hundred inhabitants of Planet Earth.
In this post I just wanted to pay tribute to two world-renowned Brits who both have connections with Apple and with where I hail from in the UK. One was born twenty miles from my hometown, and the other studied at uni some fourteen miles away.
They are Ridley Scott (born in South Shields) and Jonathan Ive (who studied at Newcastle Polytechnic). The Jony Ive connection with Apple is obvious as he is their SVP of Design.
The other connection is that Ridley Scott directed Apple's '1984' Super Bowl commercial, which is still regarded, more than twenty-five years after its broadcast, as one of the most iconic and important commercials ever made.
The question I'm asking is: where are Microsoft's and Google's Brit, and indeed North East, connections?
Who would play Bill Gates in the movies?
I loved the movie 'WarGames' starring Matthew Broderick, and always felt that the erratic and reclusive computer genius, Professor Falken, was loosely based upon Steve Jobs. I had much less time for the film 'Vertical Limit', although the ruthless and driven US billionaire character in it reminded me very much of Larry Ellison.
The funny thing is that I've never seen a cinematic incarnation of Bill Gates on the silver screen. I suppose this is something to do with a perceived lack of charisma. I don't pity Bill, however, as no matter what your thoughts on Microsoft's innovation, style or business practices, he and his company have been hugely influential for the last 20 years.
People who try to ridicule him for the infamous quote about 640K of RAM, or the failed Tablet PC, just don't seem to get that he is probably the most successful opportunist who has ever lived.
The Beast in the East
The other night I was watching a TV show about rivers (the BBC one with Griff Rhys Jones) and, apart from admiring the beautiful scenery, a recurring point of the show was that rivers and canals were the lifeblood of the Industrial Revolution, much in the way that roads were in the 20th century and copper (for broadband) is in the Information Age. As such, innovation flourished along these waterways. Indeed, something that always amazes me is just how much innovation sprang from the small island of my birth - the UK - and it was from here that the US picked up the baton during the early 20th century.
The other day I was having a lunchtime discussion with a friend who is a senior Australian academic, and I questioned whether any of the emerging economic powers, namely India or China, was poised to pick up the innovation chalice from the US. For the record, he has travelled to and dealt with major research organisations in both the US and China. His answer was categorical and a little bit worrying.
Within a generation, he believes, China will be the only engine of innovation that matters - something far removed from its current role as the world's factory. This makes me wonder what will become of Silicon Valley in twenty or thirty years' time, and I suppose it puts the current spat between the US, Google and China into some perspective.
Monkey See Monkey Do
I remember seeing a Rodney Dangerfield film in the 1990s which was pretty much forgettable except for one scene, set in a typical university amphitheatre with a lecturer and, say, 200 students. As the weeks passed, many of the students opted to skip the lectures and record them on tape recorders instead, until finally even the lecturer himself failed to show and simply played an audio recording of his lecture to the non-existent students.
So what's the connection with IT? I worked with some Oracle PreSales people in the 1990s. They were always a pretty busy bunch, responding to RFIs and RFPs and doing demos. At the time Oracle had a graduate recruitment program, and one of the grads I knew was on secondment to PreSales.
Chatting one day, I asked what she was doing, and she said she was responding to an Oracle Financials RFP. I was impressed - actually, more baffled - that given her lack of experience she was entrusted with work that would normally go to a senior Financials PreSales resource. Digging a bit deeper, the answer became obvious: the grad was pretty much cutting and pasting answers from previous RFP responses into the new one.
How could she do this? Easily, really, because the RFPs at the time were being generated by grads sitting in the Big Six consultancies, and contained a random subset of the stock RFP Financials questions they had.
The difference is that our grads were working for free (chasing new business). Somehow I don't think the client would have been paying Big Six consultancy grad rates for their RFP work.
Stuff I'd like to see in iPhone OS 4
There's lots of speculation about what's coming down the pipe in iPhone OS 4. Most of it relates to multitasking, the current lack of which has never bothered me.
Some minor features I'd like to see though are:
1) A location-aware wi-fi setting. Unlike many world cities, the wi-fi coverage here in Sydney, Australia is poor. For that reason I tend to have wi-fi enabled at home and switched off when out or at work, to improve battery life. This means that every day I toggle wi-fi on and off on my iPhone. Often I forget, and am only reminded when wi-fi asks me to join a new network while I'm using Safari. It's a minor annoyance. It would be great if the toggling could be automatic, based upon a configurable location setting (using AGPS) for, say, my home and office locations and trusted networks only. When I'm more than, say, 500 metres outside those locations, it would be nice if wi-fi switched itself off.
2) Some integration between Calendar and Contacts in iPhone OS. For example, if I'm setting up an appointment with my dentist, I currently have to enter a title and location by hand, yet the information I need for the appointment is already stored in my Contacts. If I could make a new appointment and just select a contact (or multiple contacts), rather than typing in a title or location, that would be great.
I don't pretend that either of these suggestions is radical, but it is small evolutionary steps like these that will keep iPhone OS ahead of the rest of the pack.
As previously stated, longer term I'm expecting the smartphone to morph into an identity-based device that will integrate with my house and car security. I also fully expect it to morph into a credit/payment device. Long term, I'd love my phone to replace the wallet in my pocket (say within 10 years), but for now I'd be happy with the above.
Sunday, March 14, 2010
But you can polish a Turd!
An old friend of mine frequently uses the phrase "You can't polish a turd". I believed this to be true until today.
We're four years and $140m into a BPR project that has a major SAP implementation at its heart. The interesting thing is that ten days after the Phase 1 go-live, and 1,500 related service desk calls later, the message I'm getting in the crucial run-up to the end of the financial year is that large parts of Finance simply can no longer do their day-to-day jobs.
Funny that because the SAP Implementation Program Manager's status email today doesn't really correlate with what I'm hearing on the ground.
Oh well. I suppose someone's just found that special way of polishing.
Thursday, March 11, 2010
The OS of tomorrow
In my early days of computing at school, long before the dawn of Windows, most machines (Apple 2's, BBC Micros, Commodoore PETs, etc.) booted up directly into a BASIC interpreter prompt. This meant that you never ever had to interact with the computers operating system which was entirely hidden from view. Then I went to Uni and was introduced to the world of multiuser computer systems and compilers (namely VAX VMS and Pascal). Soon after PC's running MS-DOS and later Windows started landing on our desktops and the rest is history. The only thing is that I'm wondering if we didn't miss a trick here. For nearly twenty years we've been interacting with an Operating System (i.e. Windows, Mac, Linux, etc.) and a hierarchical directory filesystem and the question I'm asking is why?
I suspect that the reason is a legacy hangover. Devices like the Palm Pilot reverted to the old model of hiding the workings of the OS from public view, and this trend continues with the likes of iPhone OS, Android and Google Chrome OS. I guess that's one of the reasons why Microsoft are chasing clouds so much at the moment.
Labels:
Cloud Computing,
Operating Systems,
Programming
The programming language of tomorrow
One programming language I never used was IBM's PL/I, which stands for Programming Language One. IBM billed this language as the only computer language you would ever need to learn. Obviously that never materialised, and today's question is why PL/I, or any other language, never became dominant.
For the record, BASIC was my first programming language and, all in all, it wasn't a bad introduction. It had all the fundamentals that we still work with today (variables, constants, subroutines, expressions, loops, arrays, input, output, etc.). Then at Uni I learned Pascal, followed by COBOL in the workplace. Since then a plethora of 4GLs, GUI-based IDEs and scripting languages have come and gone. The list seems almost endless, with the only constant being native SQL.
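Those fundamentals really haven't changed. The same ingredients a school BASIC program used can be sketched in a few lines of modern Python (the names here are purely for illustration):

```python
# The BASIC-era fundamentals, restated in Python: a constant, variables,
# a subroutine, an expression, a loop, an array and some output.
PI = 3.14159                      # a constant (by convention)

def circle_area(radius):          # a subroutine
    return PI * radius ** 2       # an expression

radii = [1, 2, 3]                 # an array
areas = []
for r in radii:                   # a loop
    areas.append(round(circle_area(r), 2))

print(areas)                      # output: [3.14, 12.57, 28.27]
```

Different syntax, same building blocks - which rather supports the point that the underlying model hasn't moved much in thirty years.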
I want to get back to my computing roots and learn a 3GL-like language to start developing software as a hobby, but the choice of languages is bewildering. Off the top of my head there's C, C++, VB.NET, Java, C#, JavaScript, Perl, Python, PHP, the iPhone SDK, the Android SDK, etc. The list goes on. I suspect that just as the server and database business has rationalised over the last few years, we are due for some pruning of computer languages. So long as they are archived for posterity - I'm hoping so anyway.
Tuesday, March 9, 2010
Where did it all go wrong?
When asked about losing his fortune, soccer legend George Best is quoted as replying "I spent a lot of money on booze, birds and fast cars. The rest I just squandered". I'd be asking a similar question of a company called Palm Inc. right now - only I don't think the answer would be anywhere near as satisfying. Palm's recent results were below market expectations and, according to reports, they are in the red and burning through their last $500m at the bank.
Back in 1996 Palm introduced a revolutionary low-cost PDA - the Palm Pilot - and it sold like hotcakes. Palm OS was a revelation with its simple and intuitive GUI, and Graffiti was a pretty good stab at handwriting recognition too. The Palm Pilot's influence can still be seen in devices like the iPhone today (large screen, no physical keyboard, few physical buttons, downloadable applications, PC syncing, etc.). In fact, when you think about it, the only real difference between Palm OS and iPhone OS is the replacement of the stylus with multitouch.
Then things started to go wrong for Palm:
- The brains trust left, came back, and then left again.
- The Treo capitulated and added keyboards, aping BlackBerry's success, instead of building on the Palm Pilot's keyboardless form factor.
- Palm OS stagnated whilst a new Linux-based OS churned in development hell. Palm even committed heresy and licensed Windows Mobile on some of their devices.
- The ill-conceived Foleo debacle.
The real irony here is that after all the issues related to the stagnation of Palm OS, they really do seem to have come up with a great successor in webOS. I might even have been tempted by the Palm Pre if it had been sold here in Australia. I for one hope that Palm can survive and prosper, but that seems an unlikely outcome. The best I think we can hope for is that one of the big handset manufacturers becomes disillusioned with Android and buys the company for its IP so that webOS can live on.
Labels:
Android,
Apple,
Palm Pilot,
Palm,
PDA,
Smartphone
Monday, March 8, 2010
Keep it Simple Stupid or Keep it Stupid Simpleton
There's an urban myth that says NASA spent millions designing a pen that would work in space whilst the Russians got by with pencils. It's held up as a perfect example of KISS. In reality the pen was developed cheaply by an independent contractor and sold to both NASA and the Soviet space programme. This myth and its ilk often used to be quoted when comparing the Cold War forces pitted against each other. By all accounts the MiG-29 is very low-tech when compared to the F-15 Eagle, for example, and there was a time in the 80's when F-15s were supposedly only serviceable four days out of ten because of their inherent complexity.
This phenomenon has parallels with some of the choices we face in modern-day IT. During the internet boom I was engaged by a dotcom to define a server architecture for their revised classifieds website. The brief was that it needed to be scalable and have very high availability. I provided the client with three options at the time:
- simple - a single highly resilient server with no failover capacity
- moderate - a hardware-based failover cluster which required a hardware reboot to fail over between nodes
- complex - a hardware/software-based load-balanced cluster which could handle a graceful soft failover
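The essential difference between the options is how traffic moves off a dead node. As a rough sketch only (the function and node names are hypothetical, and real load balancers do far more), the "complex" option boils down to routing around failed nodes via health checks:

```python
# Hypothetical sketch of the load-balanced option: requests are routed
# to the first node whose health check passes, so a node failure causes
# a graceful soft failover rather than a hardware reboot.

def route(nodes, healthy):
    """Return the first node currently reported healthy."""
    for node in nodes:
        if healthy.get(node):
            return node
    raise RuntimeError("total outage: no healthy nodes left")

nodes = ["web1", "web2", "web3"]
health = {"web1": False, "web2": True, "web3": True}  # web1 has just died

print(route(nodes, health))  # prints "web2": traffic flows on uninterrupted
```

The simple option has no equivalent of this loop at all, and the moderate option only "fails over" after an operator-visible reboot - which is exactly where the extra complexity (and cost) of the third option comes from.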
Being accustomed to having to cost-justify these options, I fully expected the client to select the simple or moderate solution. To my utter surprise they didn't blink an eyelid and went for the most expensive and complex option. They perceived the risk of an unplanned outage to be so serious that they would literally spend millions to guard against it. They also explained that the kudos they would get in the industry by following the likes of Amazon and announcing an infrastructure purchase in excess of a million dollars would enhance their perceived market valuation pre-IPO.
In hindsight, I wish I had never proposed the complex option, or at least that I had been more vociferous in opposing it. I'd previously had great success applying the moderately complex model at other sites. However, these were heady times, and so long as the client understood the risks I was prepared to give it a go. The result, despite all my best efforts, was a disaster. We had severe performance issues from day one and our reliability was worse than it would have been had we picked either of the simpler options. It was humbling, and a great reminder to always apply KISS.
The interesting thing, though, is that if we never tried to stretch ourselves now and again we'd all be building MiG-29s. For the record, the F-15 supposedly has a 104-to-0 kill ratio.
Sunday, March 7, 2010
The Big Six
I have previously worked as a consultant, and in both cases it was for software houses. I always felt that if you were going to pimp yourself out then you might as well have the inside track and work for the company that wrote the software. Despite the supposed prestige associated with them, I was never tempted to work for any of the then so-called 'Big Six' (now known as the Final Four) consulting firms, as I don't believe our personalities would have been compatible.
Why? If you had witnessed some of the things I've seen the Big Six get away with you'd understand. Some classics include:
- A 'Big Six' female PM lodging a sexual misconduct complaint against a male client PM who sought to challenge the success of a major project. I can't say whether the complaint had any grounds but it was clear that it would never have seen the light of day had the client PM kept his mouth shut.
- A Big Six firm getting a project I was working on shut down, because we were close to delivering more in three months than they had in two years, by promising to finish our project within a month. This one ended up in litigation, I'm pleased to say.
- Rigging RFI scoring to pick the most expensive packages to implement rather than those that best fit the client requirements.
- Attending onsite training courses with the client staff and then advising the client on configuration against the advice of the consultant trainers.
- A Big Six 'Oracle development practice' that was running around Sydney in the 90's developing applications. The only trouble was that they'd never discovered what an index was!
- Big Six consultants sitting onsite doing an online SQL training course whilst billing a grand a day.
I'm sure there are lots more examples out there, and I'm not saying that the consultancies I worked for were whiter than white, but there's a long way between them and what I've seen of the Big Six.
Labels:
Consulting,
Oracle,
The Big Six,
The Final Four
Wednesday, March 3, 2010
More jokes at Larry's expense
Don't get me wrong - I think Oracle are a great company and I enjoyed working for them immensely. It's just that if you're going to work for them you need to develop a pretty thick skin. Besides which I think Larry can weather a joke or two.
So here are the best two jokes that used to do the rounds when I worked at the Big O.
"What do you call an optimistic Oracle sales rep?" Somebody who irons five shirts at the start of the week.
"What's the best platform that Oracle runs on?" PowerPoint.
Thin is the new Fat
Back in the mid-90s, arguably towards the end of the client-server period, there was still a debate being had on the relative merits of Thin versus Fat Client.
At the time Oracle Apps had just introduced 10SC (a GUI fat client), replacing the previous green-screen block-character interface. I remember one PreSales technical slide from an Oracle Applications presentation in particular extolling the virtues of Fat Client over the three-tier Thin Client model used by SAP. The thin client was even given the derogatory label of 'screenscraper'.
The funny thing is that just a short while later, after the IT world had taken the browser to its heart, I saw the same Tech PreSales giving the same presentation and delivering exactly the opposite message with regard to Thin Client, Oracle Apps having by then adopted the model.
So how do you know when an Oracle PreSales presenter is lying? Probably when their lips are moving.
Labels:
Fat Client,
Oracle,
Software Sales,
Thin Client
My Year of Living Dangerously
For exactly one year I gave up being technical and concentrated on the client relations side of the IT business. I remember the year exactly because FY96 (mid-1995 to mid-1996) was possibly the worst year of my professional life. FY96 was going to be Oracle's year of Customer Service.
I was picked out as having the requisite soft skills to work as one of twelve ambassadors within the customer base, with the aim of improving overall referenceability. I was flattered, as it meant reporting to a Director, and I saw it as a great opportunity. Ultimately, however, the role came undone because:
- Halfway through the financial year it looked like Oracle Australia wasn't going to make its country numbers. Customer focus was dropped like a hot potato and sales targets re-emerged as the only real measure of success. My Director was handed another portfolio related to Support Sales, along with targets.
- Support and Sales could never agree on what our role was supposed to be. Support (my organisation) wanted us to focus on the Tier 1 accounts, and Sales wanted us to work as Tier 3 account managers.
The long and the short of it was that, whilst it was a poor year for me professionally, I did gain a valuable insight into the inner workings of the sales organisation of a very successful software house.
It's easy to write off salespeople as low, unscrupulous pond life, but that would be a generalisation and incorrect in many cases. The really successful account managers were smart, driven and hard-working. Were they honest? Well, that's the 64 million dollar question. I prefer to think that they inhabit a world of greys rather than black and white. After a year of swimming with the sharks, let's just say it was a relief to re-enter my technical world. Binary, after all, is pretty much a black and white sort of thing.
Tuesday, March 2, 2010
What no disk?
At the tail end of my time working for the big O, Larry introduced the Network Computer. From memory it was the talk of the town at Oracle OpenWorld 1997. I remember being in a client briefing with a senior Oracle Marketing VP telling the client that within two years they would have replaced their PCs with NCs.
I have to say that I bought into the concept of the NC, and so did Sun and IBM (with the JavaStation and Network Station respectively). I'd recently finished a site visit for a large telco who had basically locked down their SOE PC build so that the PC barely used the local C: drive anyway. The logical extension of this was to produce a computer that would boot directly from the network and store everything on network drives.
Funnily enough it wasn't any theoretical limitation that did for the NC; rather, when it was released, the price of PCs dropped from thousands of dollars to hundreds, undermining any potential cost savings the Network Computer might have offered. A shame really, because I still think the idea had legs.
Labels:
IBM,
Network Computer,
Oracle,
PC,
Sun Microsystems
Electric dreams
The early days of the personal computer in the US may have been dominated by the likes of the Apple II, the Tandy TRS-80 and the Commodore VIC-20, but in the UK a number of other suppliers preceded the days of the IBM PC: the likes of Sinclair, Acorn (the Atom and the BBC Micro), and Research Machines (making early CP/M-based PCs).
Of these, the first computer I ever owned was a second-hand Sinclair ZX81, and I wish I had persevered with it a bit more. My problem was that I had no way of storing any programs I'd written, something normally done at the time on cassette tape. This meant that I had to type every program in fresh each time, and anyone who ever struggled with the ZX81's rubbery keypad will understand just how frustrating this could be, and just how limited the results of 20 minutes of typing could be. Having said that, I still have fond memories of the old ZX81 and plugging away writing simple BASIC programs.
The most remarkable thing about the machine was its low cost - about fifty pounds. Just imagine if the personal computer had evolved along the lines of the ZX81 instead of the expensive beige boxes costing thousands of dollars that invaded our desktops and homes.
In a recent Guardian article it came to light that Sir Clive Sinclair, the UK's personal computing pioneer, does not actually use a PC as he believes that they are wasteful (in memory and CPU cycles), take ages to boot and that he'd rather pick up the phone than communicate via email. Funnily enough these are thoughts I've already expressed in previous entries in this blog.
Anyway, hats off to Sir Clive. A real thinker and a genuine innovator.
Madness in the Method
In the UK there was a famous 80's TV commercial for British Telecom starring Maureen Lipman (http://centuryads.blogspot.com/2007/01/you-got-ology-1987-launch-of-bt-beattie.html). She plays a stereotypical Jewish grandmother who, upon discovering that her grandson has flunked all his exams except Pottery and Sociology, replies "He gets an ology and he says he's failed... you get an ology, you're a scientist." Maybe, just maybe, Andrew (the grandson) didn't become a scientist but got another ology... a methodology instead.
This would tie in with how many methodologies are used (and abused?) in today's IT. From what I've seen, methodologies are used as a crutch by below-standard IT consulting organisations and individuals to present a veneer of professional competency and mask the lack of any real ability.
Don't get me wrong, there's nothing inherently wrong with a good methodology as far as they go, but don't for one minute believe that an ology will somehow replace common sense, experience and good judgement.
Labels:
Consulting,
IT,
Methodology,
Software Development
Sunday, February 28, 2010
Back to the Future
Back in the post 'The IT Wars' I referred to an RDBMS war that was fought between Oracle and Ingres. In reality competition in the RDBMS space has always been a little more complex than that.
The Oracle/Ingres battle just happened to be the main one at play when I entered the fray. FYI - my first RDBMS was Ingres, and I have to say that from the developer's perspective it was a far superior product to Oracle. It is generally reckoned that Oracle won out because of superior sales and marketing prowess, although this simplistic argument overlooks the fact that Oracle was the more reliable and scalable database (row versus page-level locking, anyone?).
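For anyone who missed that war, the locking point can be shown with a toy model (purely an illustration of the general idea, not any vendor's actual implementation): under page-level locks, two writers updating different rows that happen to share a page block each other; under row-level locks they don't.

```python
# Toy model of lock granularity - illustrative only, not how any
# real database implements locking.
ROWS_PER_PAGE = 10  # assume 10 rows are stored per data page

def contend(row_a, row_b, granularity):
    """Would two concurrent writers to these rows block each other?"""
    if granularity == "row":
        return row_a == row_b            # conflict only on the very same row
    if granularity == "page":
        # conflict whenever the two rows live on the same page
        return row_a // ROWS_PER_PAGE == row_b // ROWS_PER_PAGE
    raise ValueError(f"unknown granularity: {granularity}")

# Two writers touch different rows (3 and 7) that share page 0:
print(contend(3, 7, "page"))  # True  - page-level locking serialises them
print(contend(3, 7, "row"))   # False - row-level locking lets both proceed
```

Scale that contention up to hundreds of concurrent users and you can see why lock granularity mattered so much to scalability claims at the time.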
In reality Oracle has seen off the following RDBMS competition over the years:
- Ingres in the late 80's
- Sybase in the early 90's
- Informix in the late 90's
Along the way the big O has also gobbled up the Rdb and MySQL databases.
The interesting thing is that when I worked for Larry in the mid to late 90's Oracle always saw the following two as being their major database rivals:
- IBM with DB2
- Microsoft with SQL Server
So it comes as no great surprise that, with the acquisition of Sun Microsystems and the plans Oracle has for its Database Machine, the likes of IBM and Microsoft are scathing in their response, calling it a return to the bad old days of the 1960's.
So how does the combination of hardware and software affect the big players?
Microsoft - SQL Server runs on any platform, so long as it's Windows/x86. So no change there.
IBM - The various forms of DB2 run on various hardware platforms - so long as they're IBM's, that is. Again, no change.
Oracle - I'm struggling to see a downside here. Larry has picked up Sun at pretty much a rock-bottom price and with the new capability can attack the integrated hardware/software threat from IBM, Netezza and Teradata, who have all espoused that model. The combination of hardware and software is not unlike the approach taken by Apple, but it doesn't preclude Oracle's regular business of selling the database on pretty much every major platform out there.
He's also picked up Java, which was the lynchpin of Oracle's development platform anyway, and gone a long way towards removing the potential threat from open source by picking up the most successful of the open source databases in MySQL. In short, it's a win-win, because it's unlikely that he will alienate any of the existing hardware partners - they know they cannot ignore Oracle.
Does it herald a return to the heyday of the 1960's, though? Or, for anyone following this blog, does it mean we can look forward to the fully integrated corporate data model I've referred to in a few prior posts? Not likely. When Larry refers to an integrated server platform, I think you have to look deeper than the marketing blurb to understand that, apart from improvements in the systems management space and perhaps better performance integration, there really isn't anything new here. I'm not saying that the Database Machine is a bad concept, but I'd be wary of thinking that it heralds a return to when life was simple.
The Oracle/Ingres battle just happended to be the main one at play when I entered the fray. FYI - my first RDBMS was Ingres and I have to say that from the developers perspective it was a far superior product to Oracle. Ultimately it is recognised that Oracle won out because of superior sales and marketing prowess, although this simplistic argument undermines the fact that Oracle was the more reliable and scalable database (row vs page level locking anyone?).
In reality Oracle has seen off the following RDBMS competition over the years:
- Ingres in the late 80's
- Sybase in the early 90's
- Informix in the late 90's
Along the way the big O has also also gobbled up Rdb and MySQL databases.
The interesting thing is that when I worked for Larry in the mid to late 90's Oracle always saw the following two as being their major database rivals:
- IBM with DB2
- Microsoft with SQL Server
So it comes as no great surprise that with the acqusition of Sun Microsystems and the plans that Oracle has for its Database Machine that the likes of IBM and Microsoft are scathing in their response calling it a return to the bad old days of the 1960's.
So how does the combination of software/hardware affect the big players:
Microsoft - SQL Server runs on any platform so long as its Windows/x86 so no change there.
IBM - The various forms of DB/2 run on various hardware platforms - so long as they're IBM that is. Again no change.
Oracle - I'm struggling to see a downside here. Larry has picked up Sun at pretty much a rock bottom price and with the new capability can counter the integrated hardware/software threats from IBM, Netezza and Teradata, who have all espoused the integrated hardware/software solution. The combination of hardware and software is not unlike the approach taken by Apple, but it doesn't preclude Oracle's regular business of selling its database on pretty much every major platform out there.
He's also picked up Java, which was the lynchpin of Oracle's development platform anyway, and gone a long way to removing the potential threat from Open Source by picking up the most successful of the Open Source databases in MySQL. In short, it's a win-win, because it's unlikely that he will alienate any of the existing hardware partners: they know that they cannot ignore Oracle.
Does it herald a return to the heyday of the 1960's, however? Or, for anyone following this blog, does it mean that we can look forward to the fully integrated corporate data model I've referred to in a few prior posts? Not likely. When Larry refers to an integrated server platform I think you have to look deeper than the marketing blurb to understand that, apart from improvements in the systems management space and perhaps better performance integration, there really isn't anything new here. I'm not saying that the Database Machine is a bad concept, but I'd be wary of thinking that it heralds a return to when life was simple.
Labels:
Database Machine,
DB2,
IBM,
Microsoft,
Oracle,
SQL Server,
Sun Microsystems
Thursday, February 25, 2010
Check, check, check and check!
Part 3 of a 3 part post on the role of the modern smartphone
In the previous two posts I've explored the impact of the smartphone on our lives. The first post explored my hopes and aspirations for a convergence device ten years ago and how the current smartphone vastly exceeds them. The second post explored what devices it has definitely replaced or is likely to replace in the near future. In this final post I wanted to explore what's left for the smartphone to achieve. Now I'm not pretending to be a futurologist, and the potential applications for the smartphone are almost limitless. Why else have Google and Apple branded themselves as mobile device companies when mobile was neither's core competency less than a decade ago?
The focus of my smartphone predictions won't be around the likes of location services, augmented reality, advertising and mobile TV, as many are predicting; it's something far more mundane but essential to our daily lives.
Before I do this, let's explore something that I do every day before I leave the house and go to work - and I'm guessing I'm not alone here:
Check, check, check and check!
- Have I got my wallet? Check.
- Have I got my house keys? Check.
- Have I picked up my loose change? Check.
- Have I got my mobile phone? Check.
(Note that in my case I don't drive to work but if I did I'd obviously check for my car keys too).
So could the smartphone change things here? Well, when you examine all these things they resolve down to two fundamentals of life: identity and money.
In my wallet are my driver's licence (a de facto identity card), my prepaid bus ticket (currently anonymous, though in other cases, like the London Oyster Card, not so), my work entry card (identity), some cash (money), credit and debit cards (money) and a lottery ticket (potential money). The other items in my pockets: loose change (money), house keys (identity) and car keys (identity). Yes, I know the keys don't strictly identify me, but there is an implicit assumption that because I have the key I'm authorised to use it.
So can the smartphone morph into and replace the above devices? Well, technically, yes. There's nothing revolutionary about the concept of using a smartphone as a payment device. Nokia were talking about this sort of stuff 10 years ago and I believe that many others are working on the technology right now. Let's also not forget that I can already access my bank accounts, never mind eBay and PayPal, on my phone, and that we all access our mobile phone accounts (either pre- or postpay) every time we make a call or send a text. In short, it doesn't take a huge leap of faith to believe that smartphones will become payment channels in the near future.
Looking at cars, we are already starting to see keys being replaced by 'dongle-based' access devices. Is it such a stretch to see your car's iPod/iPhone integration extending to cover car security? The same could be said for home security too.
This leaves us with the rest of identity. For example, can you imagine your driving licence being stored on your smartphone (or on a cloud server accessed by your phone)? If so, what about your passport, or even some future DNA-based identity scheme? This is where I'm probably stretching the limits of how we will use smartphones in the future, not because of any inherent technical limitations, but bureaucratic ones. Could you imagine government departments accepting smartphones for ID? And I can already hear the din from the civil libertarians.
However, if the smartphone is going to evolve in these directions then we're going to need a pretty secure way for the device to identify us. Biometrics would definitely be required in the handset. We're also going to need some pretty strong encryption technology to ensure that nobody is hacking our accounts through the ether.
But the important point here is that nothing mentioned in this post is beyond the bounds of reasonable technological advances in the next few years. Considering how far the smartphone has come in such a short space of time, I'm beginning to see why Apple and Google are so interested in it.
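As a purely illustrative sketch of what "the device identifies us" might mean at the protocol level, here's a minimal challenge-response exchange in Python using an HMAC over a fresh nonce. A real scheme would involve public-key cryptography, a secure element in the handset and a biometric unlock step; every name below is an assumption for illustration, not a real API.

```python
import hashlib
import hmac
import os

# Hypothetical shared secret provisioned to the handset at enrolment time.
DEVICE_SECRET = b"provisioned-at-enrolment"

def sign_challenge(secret: bytes, challenge: bytes) -> str:
    """The handset proves possession of the secret without revealing it."""
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

def verify(secret: bytes, challenge: bytes, response: str) -> bool:
    """The verifier (bank, government service) recomputes and compares."""
    expected = sign_challenge(secret, challenge)
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)  # fresh nonce per attempt, so replays fail
response = sign_challenge(DEVICE_SECRET, challenge)

assert verify(DEVICE_SECRET, challenge, response)
assert not verify(b"attacker-guess", challenge, response)
```

The point of the nonce is that overhearing one exchange "through the ether" tells an eavesdropper nothing useful for the next one.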
Wednesday, February 24, 2010
How many devices does your iPhone replace?
Part 2 of a 3 part post on the role of the modern smartphone
In the previous post I reflected on how far the smartphone - in my case the ubiquitous iPhone, but in reality it could be any smartphone - had come, and how it had superseded not only the mobile phone and the PDA but also a plethora of other functions and devices. So let's look at what it has personally replaced for me, or is likely to replace in the next few years:
- PDA - will I ever buy another dedicated PDA? Don't think so. Sorry Palm.
- iPod - again, I'm not sure I'll ever buy another dedicated music player nor an MP4 player. Bye Apple.
- Sat Nav - would I ever buy another dedicated Sat Nav device? Probably not, although for true replacement technology I'd likely buy Tom Tom for iPhone. If I had a Google or Nokia smartphone this wouldn't be the case. Sorry Tom Tom.
- Digital camera/camcorder - I'm not sure we'll replace our compact digicam when it dies. Maybe not the camcorder either. Sorry Sony and Canon. The digital SLR is, however, safe for now.
- Portable games console - the PSP barely gets any use and could well end up on eBay before the month is out. Sorry Sony.
- Portable DVD player - my better half wanted a portable DVD player so we could take the kiddie DVDs with us when we travel. I've ripped and encoded this stuff onto my iPhone and got an iPhone Composite cable for Christmas, so we're pretty set on that front now too.
- Watch and alarm clock - this is probably not typical, but a few years ago I stopped wearing a wristwatch and have relied on my phone for timekeeping ever since.
It's amazing really that all this stuff has been, or is likely to be, replaced by the phone in my pocket. I'm not saying that the iPhone is better than any of these devices, but in most cases it's certainly good enough to replace many of the dedicated ones.
The other advantage is that it's always to hand and charged, unlike most of the other stuff, which sits in the cupboard most of the time, out of battery.
Let's see how far we've come
Part 1 of a 3 part post on the role of the modern smartphone.
Back in the late 1990's I really wanted a mobile phone that was a true convergence device, something that meant I didn't have to carry around a mobile and a PDA. At the time I had an Ericsson GF768 flip phone and a Palm Pilot 3. Ultimately the single device that I wanted would be able to make calls, manage a consolidated contact list, scribble a few notes and maybe, just maybe, write an e-mail. WAP was the next big thing then, so access to the WAP web would be a nice-to-have.
These days any half-decent smartphone will more than meet these needs, but let's just consider how much more they can do. Today my iPhone gives me all of these incredible devices and applications in my pocket: phone, calendar, web browser, contacts, camera, video camera, audio recorder, texting, messaging & Skype, IM & chat, access to social networking, email, music, movies, photo album, books, GPS & maps and games. Oh, and access to an almost unlimited supply of applications too.
I may have had to wait ten years to get it, but when it arrived my convergence device really did exceed my expectations. As Ferris Bueller says, 'Life moves pretty fast. If you don't stop and look around once in a while, you could miss it.'
Thursday, February 18, 2010
Six Degrees of Preparation
Over a coffee the other day with a friend we were discussing what qualifications are most appropriate for a 21st Century IT professional. Obviously you would assume that a BSc in Computing or equivalent would rank highly, maybe even a BA in Information Technology, although at my Uni these were regarded as the soft option among IT degrees. Other studies related to New Media and Communications may also be relevant.
(FYI I studied on a Combined Sciences course, of which one of my majors was Computer Science.)
The funny thing is that when I got my first job in IT I was already COBOL trained, so productive from pretty much day one. Not so for a couple of my new starter colleagues, who had read History and English at university respectively. They embarked on a six-month programming course.
IT is pretty unique in that way, as it has always been open to all comers. This is something that would be inconceivable in most other professions unless you had a relevant degree (e.g. Law, Medicine and Engineering), or even in new fields like Biotechnology.
Back to the original question - what is the best degree? Well, my mate believed that it would be a Law degree, followed by an MBA, because as we move to outsourcing and cloud computing he believes the only relevant skills for modern IT are drawing up and managing contracts.
That's a real shame.
Tuesday, February 16, 2010
Dark Days at 1 Infinite Loop?
According to some recent tech press articles it's been a week to forget for Apple. Firstly, Redmond launched what appears to be its first viable alternative in the mobile devices space with Windows Phone 7 Series (catchy title, eh?), and then the rest of the also-rans in the mobile phone industry (excluding Google, RIM, Microsoft and Nokia) announced the WAC as an alternative to the App Store.
Take this into account along with some lacklustre reviews of the iPad, and should we be calling an end to the Apple Renaissance?
Well, time will tell whether the Microsoft gamble will succeed, but many pundits are already writing it off just like the Zune. Steve Jobs has used the Wayne Gretzky quote 'I skate to where the puck is going to be, not where it has been', and I think Microsoft is skating to where the iPhone has been.
And as to the WAC, check out http://www.theregister.co.uk/2010/02/16/app_stores_szzz/ for some thoughts. Also, if the Ovi Store has to resort to giving away the Nokia Maps application for free to get some traction, then how will the WAC fare any better? Nokia reported 3m downloads of its Maps application last week. To put some context around these numbers, compare that to Apple's overall 3bn App Store downloads and 10bn iTunes downloads.
The much more serious threat to Apple in the mobile space comes from Google with its Android platform, despite Eric Schmidt's low-key address today at MWC. Android, and a plethora of mobile phone makers, could challenge the all-in-one hardware/software philosophy that Apple has today, just like Windows and a plethora of PC makers did for the original Macintosh computer. However, there are real differences between the early PC days and the current smartphone wars. Namely:
- Different pricing models are at play. Back in the 80's everyone agreed that the Mac was superior to the PC but cost twice as much. Most mobiles are bought on the back of network contracts, so unless Android sets start significantly undercutting the iPhone that doesn't apply here. Indeed, given that the mobile phone is something of a status symbol, a market awash with cheap Android clones may even be counterproductive.
- Smartphones are as much fashion statements as utilitarian devices, meaning that the design of the handset, operating system and application software is crucial. Here is where I think Apple has the edge. The key principles at the core of Apple are superior design and controlling the entire end-to-end user experience.
Will Android overtake the iPhone? Almost certainly. But will that be at the expense of Apple? I don't think so. I suspect that it is Nokia, Microsoft and RIM that have more to fear from this week's announcements.
Monday, February 15, 2010
I think I need an App for that
You may not know it, but we are now in the middle of a new tech land grab as established iPhone developers scramble to get native iPad versions of their apps ready for the new platform. The scale of this land grab can be measured by the incredible uptake of the iPad SDK.
And before you say it - yes, I know that the vast majority of the existing 150,000 iPhone apps will upscale and run on the iPad, but that's a stopgap at best. No doubt new iPad owners will want shiny new iPad apps.
The land grab is underway because the smart developers know that there's a narrow window of opportunity before the AppStore becomes awash with tens of thousands of iPad apps.
Now here's the thing I find most interesting. Viewing the App Store today on my iPhone I can basically flick through 20 categories and about 300 apps per category - give or take, and assuming I have an hour or so spare. That's 6,000 apps I can access before I have to resort to search strings or Genius recommendations.
This is tip-of-the-iceberg stuff. There are another 144,000 apps out there that I'm highly unlikely ever to find unless they're featured in some way. Hence my problem: how do I know these apps exist if they're not featured and I can't easily access them?
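The browsing arithmetic works out like this - a quick sketch using the post's own figures (20 categories, roughly 300 apps listed per category, 150,000 apps in total):

```python
# Back-of-the-envelope App Store discoverability.
categories = 20
apps_per_category = 300
total_apps = 150_000

browsable = categories * apps_per_category  # 6,000 apps reachable by flicking
hidden = total_apps - browsable             # 144,000 apps you'll likely never see

print(browsable, hidden)                         # 6000 144000
print(f"{browsable / total_apps:.0%} browsable")  # 4% browsable
```

In other words, only about one app in twenty-five is reachable by browsing alone, which is the discoverability problem in a nutshell.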
Maybe I need a better way of searching for the apps. You know what - I think I need an App for that!
Sunday, February 14, 2010
Object Blues
One of the key differentiators between Mac OS X and Windows, according to quotes attributed to Steve Jobs, is that Mac OS X is Object Orientated. He places this fact at the heart of a key productivity advantage allowing Apple to turn out a major release every 18 months or so; Redmond seems to take a lot longer than that for Windows. It's also enabled them to turn out the respective iPhone and iPad OS's relatively quickly.
That's fine as far as it goes, but I personally have always had a bit of an issue with OOP (Object Orientated Programming) in the real world, in that it isn't exactly consistent with the RDBMS view of the world:
- OOP is Function Centric whereas Relational is Data Centric
IMHO OOP works well when you're writing things that are function centric (like operating systems, flashy web sites, computer games and graphics packages) but is less successful for large data handling systems (like data warehouse ETLs and high-volume transaction systems).
When we have to merge the object and relational worlds together we have three options. The first is to use an object-relational bridge technology; the second is to store object datatypes inside an RDBMS. Both of these present the RDBMS in an OO-friendly format.
Then of course there's the third way, which is to use an OOP language (like Java) and write code that doesn't conform to OOP basics, negating all the benefits of Object Orientation. Guess which option I've mainly seen in my travels.
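To make the function-centric vs data-centric mismatch concrete, here's a minimal sketch in Python (chosen purely to keep the example self-contained and runnable): a domain object with behaviour, and a hand-rolled object-relational bridge that flattens it into rows. The Account class, schema and mapper are all illustrative assumptions, not any particular ORM.

```python
import sqlite3
from dataclasses import dataclass

# The OOP view: data plus behaviour bundled together.
@dataclass
class Account:
    id: int
    balance: float

    def deposit(self, amount: float) -> None:
        self.balance += amount

# The relational view: just rows in a table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, balance REAL)")

def save(acct: Account) -> None:
    """Flatten the object into a row (the 'bridge')."""
    conn.execute("INSERT OR REPLACE INTO account VALUES (?, ?)",
                 (acct.id, acct.balance))

def load(acct_id: int) -> Account:
    """Rehydrate a row back into an object."""
    row = conn.execute("SELECT id, balance FROM account WHERE id = ?",
                       (acct_id,)).fetchone()
    return Account(*row)

a = Account(1, 100.0)
a.deposit(50.0)
save(a)
assert load(1).balance == 150.0
```

Note that only the object's data survives the round trip; its behaviour (`deposit`) lives solely on the class, which is the impedance mismatch in miniature.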
Labels:
Apple,
Microsoft,
Object Orientated Programming,
RDBMS
Slumdog Billionaires
Congratulations are due today to Larry Ellison and his BMW Oracle Racing team for bringing the America's Cup home to, err, America. Back in 1998 I was working as one of Larry's minions when he came down under for the ill-fated Sydney to Hobart, so I know how much this win will mean to him.
What is of interest to me, however, is that the America's Cup has essentially become a massive ego contest between some of the world's richest billionaires. In this case, one Lawrence J Ellison, 65, US software billionaire and 4th richest man on the planet, versus Ernesto Bertarelli, 44, Swiss biotech billionaire and 52nd richest man.
Checking a list of billionaires on Wikipedia, I wasn't surprised to see that the computer industry contains more than its fair share, including Bill Gates, Steve Ballmer and Paul Allen from Microsoft. Not present was Steve Jobs, but at a rumoured $5bn and with Apple's remarkable growth he can't be far off joining that list.
Fortune has shined on Bill, Steve and Larry at some point: aside from dropping out of college, they were all in the right place (Silicon Valley) at the right time (the early 1970's and the advent of the silicon chip) with the right idea (hardware and/or software for the PC, or the database for client/server).
My query for today is where and when the next set of billionaires will be sourced. Obviously Larry and Sergey are the real winners from the Internet boom, but it looks like they are the exception of their age rather than the rule.
With hindsight it becomes clear that the early days of the personal computer and client/server computing presented opportunities for hobbyists working for love, not money, in their garages and with like-minded friends in clubs (like the Homebrew Computer Club).
My suspicion is that this was a one-off and it is unlikely that the next set of billionaires will come from the hobbyist fold. They will come from the likes of the biotechs, and it's more likely that they will complete their studies at the likes of MIT or Stanford than drop out.
I don't know why, but I, for one, find that quite sad.
Labels:
Apple,
Bill Gates,
Google,
Larry Ellison,
Microsoft,
Oracle,
Steve Jobs
Thursday, February 11, 2010
IT Nirvana
Over the last few posts I have raised an unfulfilled concept that was prevalent in the early days of Open Systems - namely the Corporate or Enterprise Database. This was the idea that all corporate data would be stored and accessed from a single, centrally managed place. Of course the central database never happened, and there are lots of reasons for this, but I like to think of the Corporate Database as a kind of IT Nirvana. If, twenty years ago, we could have shown IT managers what their systems landscapes would look like today, I think the idea might have taken off and just possibly we could have achieved that blissful IT state.
Instead we face the following issues:
- Systems complexity - get your local IT architect to print out your current systems landscape. Even better, get them to walk you through all your interfaces.
- Data Warehouses - the holy grail of the data warehouse is often the search for a single view of the truth, precisely because your multiple systems all hold different versions of it. If the data were kept in one place we wouldn't need to build expensive and complex data warehouses.
- Master Data Management - the Corporate Database is the MDM solution.
- Enterprise Data Bus - Why would we need a bus if all the data sits in the same location?
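The "single view of the truth" problem is easy to sketch. Here is a hypothetical example (the system names, fields and values are all invented for illustration) of one customer as held by three separate systems:

```python
# Hypothetical: the same customer as held by three separate systems.
# System names, fields and values are invented for this sketch.
systems = {
    "billing":   {"cust_id": 42, "name": "J. Smith",  "address": "12 High St"},
    "crm":       {"cust_id": 42, "name": "Jon Smith", "address": "12 High Street"},
    "logistics": {"cust_id": 42, "name": "J Smith",   "address": "Unit 3, 12 High St"},
}

def versions_of(field):
    """Return what each system believes the value of one field is."""
    return {name: rec[field] for name, rec in systems.items()}

# Three systems, three different 'truths' for the same customer - this
# is the reconciliation work a data warehouse ends up doing.
print(versions_of("address"))
```

Three systems, three answers: with a single Corporate Database this reconciliation would simply never arise.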
So are we likely to find spiritual enlightenment in IT soon? Well not so long as we can't see past the next quarter, or FY budget for that matter.
Luddites Unite
When I first started in IT I was a COBOL programmer, and the database I mainly used at that time was IDMSX. We had a large, powerful mainframe that supported over 250 transaction processing applications. The interesting thing was that at a conference my boss got talking to his equivalent from a similar-sized organisation. They had the same mix of systems and the same sort of user community - only their mainframe had less than half the grunt of ours.
Why? Well they had rejected database technology and kept everything in flat files. I can't vouch for how responsive they were as an IT team but it makes you think.
This highlights something we've all seen happen in IT over time - huge advances in chip, bus, memory, disk and network technology are soaked up by software bloat. OK, this is a simplistic argument, and in reality corporate IT is doing a lot more than it did 20 years ago, but it still makes me wonder just how those flat-file systems would perform on today's equivalent platforms. I think they'd scream along.
Suspect Packages
In the 70s and 80s companies used to write bespoke software for all their needs. Everything from payroll to financials to core business systems would be written in-house in COBOL, RPG, PL/1, Pascal and the like. Then in the 80s and 90s the concept of packaged applications became popular. The rationale was that it was better to purchase and implement off-the-shelf packages than to develop from scratch. This made perfect sense, and much of my early IT work involved package implementation. Ideally a package could be implemented in months rather than written in years.
Software selection became vital as the business would often need to assess how flexible that package was and how easily it would fit into the existing processes. Working on the Technical Pre-Sales side it was always important to try to get a product fit of 90% or better. Inevitably though there was never a 100% product fit and in these situations there were two potential solutions:
1) For the business to adjust its way of working (sometimes called Business Process Reengineering)
2) For the package to be changed or modified (i.e. mods)
In the vast majority of cases the latter option was chosen, as the business was reluctant to take on change. In one case I worked on a customer site where 70% of the programs had been modded, which makes you wonder why they had ever selected the package.
The one exception to this model has always been SAP. They have always insisted that BPR is a fundamental principle of their implementations and as far as I know SAP implementations are never quick. Here at the Department of Hopes and Dreams we're currently 3 years into a SAP implementation and the frustration it is causing the business is unprecedented. Makes me wonder whether the advantages of packaged apps are still there.
Master Data Blues
Master Data Management is currently one of the hot topics doing the rounds of Enterprise IT. For those unaware of the concept MDM is basically the storing, either physically or virtually, of all corporate reference data in a single MDM repository. Consultancies love to sell MDM solutions for two reasons:
1) They are quite easy to justify
2) MDM technology, whilst expensive, is available
So in theory we should all be swimming in MDMs by now. I don't know about you, but I'm not. In fact a friend of mine is the first and only person I know to have landed a successful MDM project.
So if we agree that it's the correct thing to do, and that the technology is available, then why don't we see MDMs all around?
Simple really - the business side of MDM is very, very hard, so before you begin you'd best make sure the business understands what it is biting off.
Of course, if we'd kept a tight rein on our corporate IT systems in the first place then we wouldn't need MDM would we!
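To give a feel for why the business side is the hard part, here is a minimal, purely illustrative sketch of the two decisions at the heart of any MDM exercise: a matching rule (which records are the same entity?) and a survivorship rule (which value wins?). The normalised-name match and most-recently-updated survivorship below are assumptions made for the sketch, not any product's actual logic:

```python
# Illustrative MDM sketch: merge duplicate customer records into one
# golden record. Source names, values and rules are invented.
from datetime import date

records = [
    {"source": "crm",     "name": "ACME Ltd",  "phone": None,       "updated": date(2009, 3, 1)},
    {"source": "billing", "name": "Acme Ltd.", "phone": "555-0101", "updated": date(2010, 1, 15)},
    {"source": "sales",   "name": "acme ltd",  "phone": "555-0199", "updated": date(2008, 7, 9)},
]

def match_key(rec):
    # Crude matching rule: lower-case the name and strip punctuation
    # and whitespace, so 'ACME Ltd' and 'Acme Ltd.' collide.
    return "".join(ch for ch in rec["name"].lower() if ch.isalnum())

def golden_record(recs):
    # Survivorship rule: take each field from the most recently
    # updated record that actually has a value for it.
    ordered = sorted(recs, key=lambda r: r["updated"], reverse=True)
    merged = {}
    for field in ("name", "phone"):
        merged[field] = next(r[field] for r in ordered if r[field] is not None)
    return merged

groups = {}
for rec in records:
    groups.setdefault(match_key(rec), []).append(rec)

masters = [golden_record(g) for g in groups.values()]
print(masters)  # one golden record instead of three conflicting ones
```

The code is trivial; agreeing with the business on what those two rules should be, across every source system, is the part that sinks MDM projects.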
Labels:
Consulting,
IT,
Metadata,
Project Manager,
Software Architect
Benson - 9th Generation PC
I was a huge fan of Paul Woakes' groundbreaking 'Mercenary' computer game, which is probably best remembered for its smooth 3D vector and polygon graphics. What is perhaps less well remembered is that your character was guided through the game by a wiseass sidekick called Benson, a so-called 9th generation PC. Benson would alert you when you were under attack, communicate with the locals (the Palyars and Mechanoids), and so on. At the time I imagined that Benson was some sort of wearable, PDA-like device - maybe strapped to your arm, for example. I certainly didn't believe that a 9th generation PC was some beige box, screen and keyboard that you lugged around an alien landscape.
So this has got me thinking - what would nine generations of the PC look like?
1st - Well, a first generation PC is easy - you're probably still using one on your desk today
2nd - That would have to be the laptop
3rd - I'm figuring that would be a PDA (Palm Pilots, Psions, Windows CE) or smartphone (iPhone, Blackberry, etc.) - If I'm honest these are pretty close to what I imagined Benson would be twenty years ago - only perhaps wearable.
4th - Is it the tablet? Time will tell.
What the 5th-9th generations will be is still open for futurologists and sci-fi fans to debate, but here are some candidates:
I'm thinking that 5th generation PCs will be devices like the augmented-reality e-paper displays seen in the movie Red Planet. Alternatively they could be wearable PCs that project augmented-reality information onto HUD-style goggles/spectacles or, as some have proposed, contact lenses.
Further afield is impossible to predict, but if I had to have a stab at it then generations 6-9 would adopt the technologies proposed in many sci-fi movies (Firefox, Strange Days, eXistenZ, The Lawnmower Man, The Matrix, Johnny Mnemonic, etc.), which interact directly with our brains via some digital thought bridge, as scary as that might seem.
The one thing I can be certain of is that the existing first-gen PC won't be around forever.
Tuesday, February 9, 2010
Home or Away
In the previous post I touched on working from home. Many of us have PCs at home or work laptops on our desks, so in theory many of us could be doing the same job from home. In the main, however, we don't, and I believe there are a number of reasons for this:
- Communication - I think face-to-face communication is great, and by working from home we'd lose this. However, as I've already touched on in "The Great Lost Art of Communication", I think this is becoming less relevant in today's workplace. If we're emailing each other across the office floor we might as well be emailing each other across the city.
- Management vision - I don't for one moment think our management have seriously sat down and debated offering work-from-home options to large parts of their workforce.
- Measuring how we work - easy enough for sales guys and call-centre workers, but what about the rest of us? Performance is usually measured on hours and on perceptions of our abilities at our annual reviews, both of which are difficult to assess when people work from home. If we want to measure effectively how our home-based workers are doing then we need to put a lot more effort into assessing how long tasks should take. Perhaps we even need a system of paying based not upon hours but upon achievements!
- Trust - if we still measure by hours then this brings the question of trust into the equation. Work-from-home arrangements are often abused.
None of the above is insurmountable, but they do require management vision and an ability to think outside the current way in which we allocate work and measure our success in achieving our goals.
Congestion Blues
My average work commute speed is about 8mph*, only just better than the average London rush-hour car journey at 7mph - slower than a horse and cart achieved at the start of the 20th century.
Cars are great, so what's gone wrong? Well, the problem is the tarmac and the fact that we all want to use the same bit of it at the same time. So if that's the problem, what are the possible answers?
- Build more roads - yet studies have shown that new roads encourage people off public transport and into cars with no net benefit to congestion.
- Build great public transport - yet most governments are too bankrupt to fund the sort of public works programs required to make major infrastructure improvements here.
- Work from home - the best commute is the one you don't have to take. With broadband and mobile technology we have the infrastructure and tools that enable us to really embrace this solution, but so far it's only a minority one.
So what's this got to do with IT? The answer is below:
- Build more roads - is the equivalent of building more interfaces between our systems. Except that in our case each road is made from different materials, using different construction techniques, different traffic lights and different road signs.
- Better public transport - is the equivalent of building an enterprise bus architecture. A great potential solution, but still very costly and complex.
- Work from home - back in my earlier post "Redundancy Blues" I lamented the lack of a single corporate database. Imagine if all the data our company held was in one database. We would ensure it ran on the best hardware, backup and recovery would be easy, and we wouldn't need to build data warehouses to try to discover a single version of the truth. In short, if we had realised the potential benefits offered to us by Open Systems we'd be in a far better place than we are today.
Oh, well, must go and get my pick axe. I'm building a new road today.
*Please note that I consider myself very lucky that my commute is only half an hour. Across the world many, many people spend 2+ hours per day commuting to work and back - about 25% of the time they actually spend at work!
Monday, February 8, 2010
ROLAP smolap 2
Years ago, at ungodly hours, BBC2 used to broadcast TV lectures on behalf of the UK's Open University. Many of the films date from the 60s and 70s and are usually quite funny to watch because of the dubious haircuts and fashions prevalent in academia at the time.
I remember one such programme caught my attention because it focussed on the SQL language. The lesson extolled the virtues of putting an English-like query language into the workplace, implying that your average office worker would be comfortable using it. Compared to the pointer-driven databases that were prevalent in the 70s, I can understand why they might have thought this way. In practice, however, companies never let their users loose with SQL, and it remains very much an IT development tool. There are a number of reasons for this (Cartesian products, anyone?) but one of them is NOT that SQL is an inherently complex language - it isn't - rather, the data we work with often is.
This led to the development of simpler dimensional data models which would format the data in a more user friendly manner - mostly with the Star Schema and to a lesser extent the snowflake schema. Fundamentally all OLAP and ROLAP technology is built to utilise these dimensional models.
So what's the problem? Well in a couple of projects recently I've had end users who have rejected the Star Schema as being too complex. In both cases they wanted a single denormalised view of the world - kind of a denormalised fact and dimension all-in-one superset.
Why did they reject the star schema? The reasons varied from not wanting to understand its complexities (e.g. surrogate join keys, current flags, effective dates) to a misguided belief that querying a single 'all-in-one' table would outperform a star query.
It gets worse: one of the BI Managers had come up with the construct of a daily snapshot, such that every day the full denormalised snapshot dataset would be inserted even if there were no changes to the source data.
In both cases I fundamentally disagreed with the customer and argued on behalf of the star schema. One battle I won and one I lost. Funnily enough, the battle I lost wasn't because of any inherent objection to the star schema - it was lost because the users had built a Visual Basic front-end application through which to query and report their data. What they were able to achieve in VB had my Business Objects developers' heads spinning. For those reasons I state again: ROLAP is old hat. If you want to keep your BI consumers happy you'll need to start building them some nice apps.
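For the curious, the "complexities" the users balked at are easy to show with a toy star schema in SQLite (all table and column names are invented for this sketch): one fact table joined to a dimension via a surrogate key, with a current flag on the dimension.

```python
# Toy star schema: a fact table joined to a dimension via a surrogate
# key. Table names, columns and data are invented for illustration.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,  -- surrogate key
        product_name TEXT,
        current_flag TEXT                  -- 'Y' for the current version
    );
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),
        sale_amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget', 'N'), (2, 'Widget v2', 'Y');
    INSERT INTO fact_sales VALUES (1, 10.0), (2, 25.0), (2, 5.0);
""")

# The star query: the surrogate-key join is exactly the complexity the
# users did not want to learn.
star = db.execute("""
    SELECT d.product_name, SUM(f.sale_amount)
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.product_name
""").fetchall()
print(star)
```

The join and the current_flag column are precisely what an 'all-in-one' denormalised table hides from the user - at the cost of repeating every dimension attribute on every fact row.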
Sunday, February 7, 2010
Avatar and the Real Time Data Warehouse
Last night I saw a feature on the making of Avatar. What caught my attention about this was the way in which they had created a 'virtual camera' through which James Cameron could observe the live action of his actors morphed into their avatar bodies and displayed in the CGI generated world of Pandora - in REAL-TIME.
In my business world, which is about as far removed from Pandora as you could get, I build data warehouses and have been doing so for many a year. From time to time the concept of Real-Time Data Warehousing has arisen and largely been dismissed for the following technical reasons:
- Data Dependencies
- Slowly Changing Dimensions
- Complex Transformations
- Updating Aggregate/Summary Data and Cubes
All of the above are still valid, and there would still need to be a sound business reason to take on the extra cost of building a real-time data warehouse as opposed to a cheaper batch one.
However, I'm sure the technical hurdles the Avatar film-makers overcame make a far longer list than my sorry one. Maybe it's time for a rethink!
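Of the hurdles listed above, Slowly Changing Dimensions illustrate the problem well. Here is a minimal Type 2 sketch (field names and dates are invented): instead of overwriting a changed attribute you close off the current row and insert a new one with effective dates - fiddly enough in a nightly batch, let alone trickled in real time.

```python
# Minimal SCD Type 2 sketch: close the current dimension row and open
# a new one when an attribute changes. Fields and dates are invented.
from datetime import date

dim_customer = [
    {"surrogate_key": 1, "cust_id": "C1", "city": "Sydney",
     "valid_from": date(2009, 1, 1), "valid_to": None, "current": True},
]

def apply_scd2(dim, cust_id, new_city, change_date):
    """Apply a Type 2 change: expire the current row, insert a new one."""
    for row in dim:
        if row["cust_id"] == cust_id and row["current"]:
            if row["city"] == new_city:
                return  # no change - nothing to do
            row["valid_to"] = change_date
            row["current"] = False
    dim.append({"surrogate_key": len(dim) + 1, "cust_id": cust_id,
                "city": new_city, "valid_from": change_date,
                "valid_to": None, "current": True})

apply_scd2(dim_customer, "C1", "Melbourne", date(2010, 2, 7))
# The dimension now holds both versions of the customer: old facts
# still join to the Sydney row, new facts to the Melbourne row.
```

In a batch world you run this once a night; in a real-time world every trickled fact may arrive while its dimension row is mid-change, which is where the dependency and aggregate problems on the list above start to bite.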
Thursday, February 4, 2010
What goes around
Ever wondered how, over the last 20 years or so, popular culture has begged, borrowed and stolen everything from the fifty years before that? I think we're currently up to 1983 in this playback. Goodness knows where we will go after we've revisited grunge, because I don't think anything original has been created since then.
Strangely, Enterprise IT has its cycles too:
Centralised IT - Mainframe and green screen dumb terminal
Client Server IT - Midrange UNIX boxes and desktop PCs. Custom-built GUI applications.
Distributed IT - n-Tier applications, middleware, web and application servers and browser delivered applications.
The problem with this model is that every time we've added something we've also taken away.
Green screens were great for data entry but try viewing a BI dashboard on one. GUI Apps were much prettier than browser apps but required all those installation and network overheads. Browsers are great for distribution and access from anywhere but often have a woeful user interface.
I've been waiting for the last 10 years for the next cycle to arrive. One candidate was the advent of Rich Internet Applications like Ajax, Flash or Curl but frankly I'm still waiting for these to arrive in Enterprise IT.
If the iPad gets picked up by the Enterprise then maybe all that will change and we'll start to see apps being developed that are user-friendly yet easy on the infrastructure. Isn't that a little bit like a return to the client/server model?
Netbook Blues
Here at the Department of Hopes and Dreams one of the key strategic initiatives underway at present is Kevin Rudd's Digital Education Revolution (DER). In essence this plan involves providing all teenage school students with a netbook computer. Over time elements of the curriculum will hook into the equipment transforming the education experience. I'm no expert in this field but even I can see the potential here so in one sense hats off to Kevin and Julia for the program. More about that later.
A few years ago I gave a hand me down PC to my Mum as she showed interest in getting on the internet. Despite my Mum attending a few computer courses she's still a technophobe at heart meaning that the PC is effectively used for solitaire, skype, e-mail and a little bit of internet. As we now live 9,000 miles apart Skype Video has become by far the most important of these applications.
A few months ago she got a virus, ultimately requiring a complete PC reinstall, a new ISP software CD, etc. Long story short, the outage lasted about 3 weeks and was only fixed by calling in the services of a PC repair man.
Now back to the DER. Here in NSW some 500 Technical Support Officers (TSOs) - effectively glorified PC repair men - have been hired to support the program. Maybe it would have been better just to pick a better device than an underpowered, crash-prone PC running Windows - say an iPad, or a Google tablet running Chrome, perhaps?
Speaking of which, when Steve gets round to putting a front camera in the iPad I figure it'll be just the device for my technophobic Mum.
Wednesday, February 3, 2010
The next iPhone Killer please stand up
In the movie 'The Right Stuff' there's a scene where the wannabe test pilots enviously discuss Chuck Yeager and how every time a challenger comes along he just suits up and pushes the flight envelope a little bit further.
I think the following list of phones must feel a little bit like those wannabes as, at product launch, each one has been dubbed as a potential iPhone killer:
- Nokia N95
- T-mobile G1
- Blackberry Storm
- HTC Magic
- Palm Pre
- HTC Hero
- Nokia N97
- Motorola Droid
- Google Nexus One
The important thing to understand is that, in its own way, each of these phones has had something to offer that was better than the equivalent iPhone it was pitched against, whether that be an improved camera, a physical keypad, multitasking, push email, etc. All have tried, yet so far none has even come close to usurping Cupertino's touchscreen miracle. You can speculate yourself as to the reasons why.
What must be particularly exasperating for their competitors is how little Apple has actually had to do to stay at the top of the pyramid. In the 3 years since its launch we've seen one case redesign, a couple of processor speed bumps, extra storage, a slight camera improvement and evolutionary software improvements. That's it.
They always say that true greats make everything look simple. I guess you could say the same for Apple and the iPhone.
The Price of Everything, The Value of Nothing
Back in the days when I wore a professional services hat, the T&M (time and materials) contract was prevalent. T&M was good for a consultancy as it loaded most of the risk onto the client. I also think it did a good job of focussing client minds, as the last thing they wanted was expensive consultants sitting idle whilst they prevaricated over making a decision.
Over time clients increasingly moved to a Fixed Price model for their consulting engagements, which helped control their costs. It also shifted the risk to the consulting firm, and I know that my consultancy would add a margin of 30% to offset that risk. What clients failed to understand in the shift from T&M to Fixed Price was how it would change the mindset of the consultancies they engaged.
Back in the T&M world we really focussed on doing things right. We knew we were expensive and we knew we had deadlines to meet. I can honestly say that when I worked on site I believe I acted in the client's best interests. The problem with Fixed Price is that the consultancy focusses on doing as little as possible to cover its contractual obligations.
Hence the client may think they're getting a better deal by fixing the price, but you know what: I'm pretty sure they haven't figured out how to measure whether they're getting good value or not.
The MobCon War
A little over a month ago, in my post 'The IT Wars', I decried the lack of a current IT War. Like it or not, wars are the engine of innovation: hence the old quote that in 300 years of peace the Swiss invented the cuckoo clock, whilst in the Second World War the protagonists invented radar, rockets, the jet engine and the atom bomb.
My earlier assessment was that the following IT Wars had been fought and the victors had prospered.
The desktop wars: Microsoft Windows vs IBM OS/2
The server wars: Mainframe vs midrange
The RDBMS wars: Oracle vs Ingres
The browser wars: Internet Explorer vs Netscape
The Search Engine Wars: Google vs Yahoo
In my defence, the focus of my original post was commercial IT, but on rereading the post, and given that on occasion this blog has strayed into the MobCon space, I figured it was time to put the record straight and comment on the current war at hand.
To paraphrase the Excapite blog: The MobCon Wars. If you want to understand the MobCon I'd recommend some essential reading at http://excapite.wordpress.com/
There are lots of protagonists originating from the technology, telco and media fields, but I believe that this war will be fought most bitterly between Apple and Google. Up until a couple of years ago these two tech giants seemed to be peacefully co-existing, focussed more on attacking Redmond. Eric Schmidt even had a seat at the Apple high table. There are even rumours that the two companies had a no-poaching agreement for staff.
Judging by the second-hand quotes attributed to Steve Jobs at a recent Town Hall meeting, it would seem that relations are somewhat strained between the two tech goliaths. He appears to be aggrieved that the Mountain View mob have strayed onto his turf with the Nexus One and a potential Chrome-based tablet, whilst Apple to date have stayed out of the Search Engine field. Nothing like a bit of siege mentality to focus the mind and steel your troops for combat.
You can place your bets on the eventual winner, but if I were a betting man I'd have to put my money on Apple. After all, they're a company I choose to spend my hard-earned cash with. When's the last time you bought anything from Google?
Tuesday, February 2, 2010
ROLAP smolap
One of the first ROLAP tools that I came across was Oracle's Discoverer product. As one of Larry's consultants I led a Data Warehouse team that delivered our reports using it when it was a brand new product. So new, in fact, that the client didn't realise the paint hadn't dried on it and it was actually pre-production software. They assumed that Discoverer 3.0 had been preceded by versions 1.0 and 2.0. There's another story in there about trusting Oracle Sales and Marketing, but I digress.
Some 12 years later I came across Oracle Discoverer again. To my surprise, very little appears to have changed. The EUL and full client looked almost identical. I'm sure that under the hood there have been some changes for intranet, PDF and web delivery, but I'm still a bit amazed at the lack of innovation in the ROLAP world.
Business Objects finally seem to be getting things together with BOXI R3, and I have to admit that I haven't seen Cognos's stuff for a while, so I can't comment on them.
The only real innovation I've seen in the last few years was ProClarity, before they were swallowed up by Microsoft, but that's an OLAP, not a ROLAP, offering.
A few years ago I started to believe that the Reporting tools were stagnating and that cubes, whether OLAP or ROLAP, weren't the answer. I hoped that the move to RIAs (Rich Internet Applications) and tools like Curl would fill that gap, but as yet nothing seems to have developed there.
Maybe new platforms like the iPhone and more importantly the iPad will spur the sleeping Reporting giants into a new series of innovation. I hope so.
Labels:
BI,
Business Objects,
Curl,
Data Warehousing,
iPad,
iPhone,
Microsoft,
OLAP,
Oracle,
ROLAP
Apple's BusinessAppStore
With the launch of the iPad, Apple will have three content based online stores. These are:
- iTunes
- AppStore
- iBookstore
What I'm wondering today is whether they need a fourth store, dedicated to Business. What I mean is that there could potentially be hundreds of thousands of business apps that companies might want to deliver internally but not make available to the world at large.
For example, what if I wanted to develop a front-end dashboard application for my EIS? As I've already stated in a previous post, we are being pressured to deliver PDF reports to our CFO's iPhone, which breaks our Warehouse Security model. The potential could be enormous, but then so would be the challenges.
For a start, how would Apple charge for a BusinessAppStore? Currently they take 30% of the revenue from all paid iPhone Apps. Security would also be an issue, of course. But I think the idea has legs.
Just out of interest, I noted the other day that there is a SAP Business Objects App for the iPhone, so it's obviously not only game developers who see interest in Apple's devices in a business context.
Expand this concept wider and hook it into Apple's cloud platform, MobileMe, and we really could see something of interest.
Labels:
Apple,
AppStore,
Cloud Computing,
iBookstore,
iPad,
iPhone,
iTunes,
MobileMe
My Online Doppelgänger
It seems that, a few months after I started my IT Journeyman blog, I have acquired an online doppelgänger. That's OK because I'm not the jealous type.
Having skimmed said blog the following post http://www.itjourneyman.com/2010/01/16/data-warehouse-2nd-time-is-a-charm caught my interest and it's essentially a rehash of a few white papers on "pitfalls/mistakes to avoid when building data warehouses". The long and the short of the post is that your first data warehouse will be a failure but don't worry because the second one will learn from those lessons and succeed.
I'd love to say that this was true, but IMHO it's just not that simple. In my travels I've worked on first-stab data warehouses that have been blinding successes, and also third tries that have had no more luck than their predecessors.
There are lots of elements that go into making a data warehouse project succeed or fail and often the initial expectation setting exercise is crucial. We have to be very careful in determining the criteria of what makes a data warehouse work and what doesn't.
It's a bit like marriage and divorce. Most people would assume that, by definition, a fifty-year marriage must have succeeded - but what if the husband and wife were at each other's throats for the duration? Likewise, divorce after 10 years is seen as failure, but what if you've produced a couple of wonderful, well-adjusted kids and parted amicably? Expectation is everything.
What I can say is that in my experience Data Warehouse projects are difficult, and that's why I choose to work in that field and not in implementing somebody else's off-the-shelf package.
Data Warehouse projects are voyages of discovery, and it's what we learn along the way, not necessarily where we end up, that's really important. The problem is that most organisations and most PMs just don't understand that yet.
Labels:
BI,
Data Warehousing,
IT,
Project Manager,
Software Development
7x24
If you've been around in IT for a while you've probably come across the term 7x24 meaning 100% system uptime.
I was once employed in London by an Investment Bank as a DBA, where we were developing a mission-critical global options trading system. Luckily the data volumes were small, the servers and environments stable, and I'd had plenty of time to work through a reliable hot-standby failover solution with an excellent UNIX sysadmin. All was good in my world.
Then during the preparations for go-live the topic of Availability arose. The Project Manager threw into the mix that we had to guarantee 7x24 availability.
My response was that we could aim for 100% uptime excluding planned outages, but that we couldn't guarantee it. This resulted in a bit of table-thumping, as was quite common on IT projects in an Investment Bank.
Suffice it to say that when I explained the costs and complexities involved in guaranteeing that high an availability, from both the solution side and the human side, the PM became a bit more reasonable, especially when I threw in the fact that neither Scott McNealy nor Larry Ellison could guarantee 100% uptime on the configuration of Solaris and Oracle on which the solution was built.
So the lesson is that before you start discussing High Availability, the metric that needs to be understood is the actual cost to your business, either in dollars or reputation, of the mission-critical app being unavailable. Until you have that, there's really no point in discussing the HA requirements of the system. The funny thing is that when I was consulting I designed lots of Technical Architectures, and never once could I get that figure out of the client.
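As a back-of-envelope illustration of that missing metric, here's a small Python sketch that converts an availability target into the unplanned downtime it permits per year and prices that downtime. The cost-per-outage-hour figure is an entirely made-up assumption for illustration, not a real trading-desk number.

```python
# Sketch: translate availability targets into permitted downtime per year,
# and price that downtime with an assumed cost per outage hour.
# COST_PER_OUTAGE_HOUR is a made-up illustration, not a real figure.

HOURS_PER_YEAR = 24 * 365

def downtime_hours(availability_pct: float) -> float:
    """Hours of outage per year that a given availability target permits."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

COST_PER_OUTAGE_HOUR = 250_000  # assumed dollars lost per hour of unavailability

for target in (99.0, 99.9, 99.99):
    hours = downtime_hours(target)
    print(f"{target}%: {hours:.2f} h/yr of downtime, "
          f"exposure ${hours * COST_PER_OUTAGE_HOUR:,.0f}")
```

Run a table like that past the business first; the HA conversation gets much shorter once a number is attached.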
Kindle Surprise
Today I saw my second Kindle on my commute home. Unlike my first encounter I didn't feel that it was a LOL moment, but neither did I come away with any sense of envy regarding the device. I'd probably categorise it as an interesting piece of technology, but one that I will pass on.
Monday, February 1, 2010
Assisting the Police with their inquiries
Back in 1998 I was doing some Pre-Sales Consulting for an Account Manager trying to sell a Data Warehouse solution to a local state police force. I badgered the salesman to let me use the above title as a tagline on the demo but unsurprisingly he didn't see the funny side.
During the demo the thorny question of Metadata came up. More precisely: Consolidated Metadata. As I'd just come off a project where I'd defined the Metadata Architecture and Solution, I was well qualified to answer the query.
At the time we had three sources of metadata for our solution. These were:
- The Database Data Dictionary
- The CASE/Data Modeling tool in use
- The ROLAP Semantic Layer
Note that we didn't use an ETL product, which would have been a fourth source of Metadata.
Now the interesting thing here is that all the software was written by the same company, in the same software labs, so one would hope that some level of shared metadata would be possible. Alas, no. Not only did the metadata in each repository overlap, but there was no easy way of combining it into a single consolidated metadata repository.
I answered the question honestly: nobody had a good story here, not us nor our competition. I think the client appreciated my honesty. The account manager, obviously not wanting to leave a bad impression, did what all account managers are prone to do and started promising vaporware, with some cock-and-bull story about the software labs in California working on that problem.
The interesting thing is that here we are over a decade later and I've still to see a good answer to this problem.
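For what it's worth, the mechanics of consolidation aren't the hard part; getting the vendors to share is. Here's a toy Python sketch, where the three source extracts and their column attributes are invented examples, that merges per-column metadata from the three repositories and records which sources described each column, making the overlap visible:

```python
# Toy consolidation sketch: merge column-level metadata pulled from three
# repositories into one catalogue, tracking which sources described each
# column. The extracts below are invented examples, not real tool output.

data_dictionary = {"SALES.AMOUNT": {"type": "NUMBER(12,2)"}}
case_tool       = {"SALES.AMOUNT": {"business_name": "Sale Amount"}}
semantic_layer  = {"SALES.AMOUNT": {"folder": "Finance"},
                   "SALES.REGION": {"folder": "Geography"}}

def consolidate(*sources):
    """Merge (name, repository) pairs; later sources win on conflicting keys."""
    merged = {}
    for name, repo in sources:
        for column, attrs in repo.items():
            entry = merged.setdefault(column, {"sources": []})
            entry["sources"].append(name)
            entry.update(attrs)
    return merged

catalog = consolidate(("data_dictionary", data_dictionary),
                      ("case_tool", case_tool),
                      ("semantic_layer", semantic_layer))
```

The real difficulty, of course, is that the repositories don't agree on names, keys or semantics, which is exactly the problem the vendors never solved.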
Taking the Mountain to Mohammed
I've been working in the field of Data Warehousing for some 13 years now. Actually, my first ever data warehouse was a Reporting System I built back in 1992, long before I'd heard the terms DW & BI, but that's another story.
The interesting thing that has, so far, been a constant in all that time, no matter what style of Data Warehouse (from a full-blown Inmon Corporate Information Factory to Kimball Federated Data Marts), is that we extract data from source systems, move it, and load it into a data warehouse (be it an EDW, Data Mart, ODS, RDS, whatever). We'll use terminology like ETL, OLAP, ROLAP, Cubes, Star Schemas, Metadata, Slowly Changing Dimensions, etc. along the way to baffle the business and make ourselves seem clever, but fundamentally any data warehouse or data mart involves moving data from a source system into a target reporting system.
Back in the 90's this made perfect sense, because it was inconceivable that we could throw resource-consuming queries and reports at the mission-critical core business systems.
Nowadays that's just not the case. There are many technical solutions out there that could enable us to place a large and significant batch query and reporting load against our production data with zero impact on the core business systems. Technologies that spring to mind include Server Virtualisation, Disk Replication and Mirroring, and O/S and Database Parallel Server technologies.
The question is: why don't we employ these technologies? I suspect that in the field of DW & BI we're stuck in a Kimball or Inmon rut, and that for the time being we will continue to take the Mountain to Mohammed.
Ah, but what about history, I hear you ask? Well, yes, it's true that we often capture history in the data warehouse that we cannot keep in our online systems, but the need for and justification of history is often overstated. Besides, another way in which we could keep all the history we'd ever need (and we probably already do this to some degree anyway) is to ensure that all PDF reports produced are kept online in some fashion. There are alternatives if we are creative.
Maybe within the decade we'll see a shift away from this and let Mohammed walk to the mountain for a change.
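To make the idea concrete, the "zero impact" approach could be as simple as routing heavy reporting SQL to a mirrored read-only copy of production while transactional statements stay on the primary. The sketch below is purely hypothetical: the connection strings and the crude keyword-based classification rule are my own illustrative assumptions, not a production design.

```python
# Hypothetical router: send aggregate/reporting SQL to a mirrored read-only
# replica so it places no load on the primary OLTP system. The DSNs and the
# crude keyword test are illustrative assumptions only.

PRIMARY_DSN = "oracle://prod-primary:1521/oltp"   # mission-critical OLTP
REPLICA_DSN = "oracle://prod-replica:1521/oltp"   # disk-mirrored read-only copy

REPORTING_MARKERS = ("GROUP BY", "SUM(", "AVG(", "COUNT(")

def route(sql: str) -> str:
    """Pick a connection string: reporting SELECTs go to the replica."""
    upper = sql.upper().lstrip()
    if upper.startswith("SELECT") and any(m in upper for m in REPORTING_MARKERS):
        return REPLICA_DSN
    return PRIMARY_DSN
```

A real deployment would classify by user or application rather than by parsing SQL, but the principle is the same: Mohammed queries a mirror of the mountain.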
Where is the Henry Ford of IT?
I can only imagine that Henry Ford was an amazing man. His invention of the production line is probably the greatest commercial achievement of the 20th Century. Here's my question for today - Does IT need its own Henry Ford?
If we compare the production-line analogy to IT development we will traditionally end up with something based upon the SDLC (System Development Lifecycle). That's fine as far as it goes, but as always the devil is in the detail. In all my 20 years of industry experience I've never been able to use the same development process unchanged between sites. Think about that for a moment. Every IT Development Team has had different processes, standards, tollgates, etc., yet we're all effectively doing the same thing. I should be able to move from one job to another and, technology aside, be instantly productive. However, as new developers to a site we spend longer dancing our way through the process minefield than we ever do writing code. Something just isn't right there.
Rancid Aluminium2: 26 billion smackers down the gurgler
In a recent report it was stated that the UK Govt under NuLabor had wasted about GBP26bn on failed IT initiatives. That's about GBP2bn for every year in office, with an ROI of zero. Just think about that for a moment.
Maybe some of that money was wasted on building 1,700 websites, of which only 431 will remain by the end of 2010, after a recent audit recommended that most be culled.
Meanwhile, James Cameron spent about GBP180mn making 'Avatar' over 4 years, whilst pioneering new technology, and got an ROI of over GBP1bn in less than three months.
Honestly, the UK would have been wiser investing this money in Cameron's Lightstorm and Peter Jackson's WETA, and could conservatively have made a profit of GBP100bn, which is half the money that the BofE has printed with its policy of Quantitative Easing to bail out the banks.
OK, it doesn't work like that, and we know that when UK Government money finds its way into the arts (via Lottery funding) we end up with films like 'Rancid Aluminium' and not Cameron's smash hit.
What puzzles me is that the government still tries to run large IT projects at all, because everyone knows it's just a licence for government-approved suppliers to print money.
I don't know how much the US has spent on intelligence-related IT projects post-9/11, but what I do know is that they failed to stop a known terrorist suspect from boarding a flight on Christmas Day.
So what's my point?
Over the last decade I've had the pleasure of working with two managers who successfully defined how they would structure and govern large projects in order to avoid the profligate wastage in government IT spend.
The first even wrote a thesis about how large projects are inherently more difficult and risky to land than smaller ones. The second built an IT governance framework that consisted of a few simple ground rules:
- all projects to be sponsored by the business without exception
- no project to last more than 9 months. Any piece of work larger than this would be broken into phases of less than 9 months' duration.
- no project to cost more than GBP2m
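For fun, here's how those three ground rules might look as an automated gate check. This is purely my own sketch in Python - the field names and the checker itself are hypothetical, not anything from the manager's actual framework.

```python
# Hypothetical sketch: the three governance ground rules as a simple gate
# check. Field names ("business_sponsor" etc.) are my invention.

def groundrule_violations(project):
    """Return a list of rule violations; an empty list means the project may proceed."""
    violations = []
    # Rule 1: all projects to be sponsored by the business, no exceptions
    if not project.get("business_sponsor"):
        violations.append("no business sponsor")
    # Rule 2: no project to last more than 9 months
    if project.get("duration_months", 0) > 9:
        violations.append("longer than 9 months - break into phases")
    # Rule 3: no project to cost more than GBP2m
    if project.get("budget_gbp", 0) > 2_000_000:
        violations.append("costs more than GBP2m")
    return violations

# A typical government-sized programme fails two of the three rules:
big = {"business_sponsor": "Finance", "duration_months": 24, "budget_gbp": 5_000_000}
print(groundrule_violations(big))
```

The point of keeping the rules this blunt is that there's nothing to argue about at the tollgate: a project either fits or it gets split.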
Sounds simple, and yes, it works. But what about when you need to do the big projects? Well, I guess we probably need Project Managers of the calibre of James Cameron for that; otherwise you're better off saving your pennies for bailing out a bank or two.
Sunday, January 31, 2010
140,499 Reasons why the iPad will be a hit
I wanted to give it a few days before commenting on the iPad, in order to let the effects of Steve's Reality Distortion Field abate. Now that it has, here's my take on it.
Most of the tech journoblogs were underwhelmed by the device - and it's hard to argue against their technical critiques. The iPad does have some serious technical deficiencies including, but not limited to:
- no front camera
- no Flash support
- no multitasking
Ultimately I believe that none of this will matter and the iPad will be another big hit for Apple. Why? The following numbers are crucial:
- 140,000 - the genius of this device is that pretty much all iPhone Apps will run out of the box on the iPad. The iPhone and iPad SDKs are compatible, meaning that anyone who has invested in building iPhone Apps can easily convert to building iPad Apps.
- 499 - the low price point is crucial, especially when establishing a new category of device. Would you spend a grand on an iPad when you can get a netbook for half that? Probably not. Reduce the entry point to $499 and the answer will probably be different. I know where I'd be putting my cash.
As to the technical criticisms of the device I'm mainly in agreement with the tech press about their nature, but don't think that this will seriously dampen demand.
- No Front Camera - This is a curious one because I can see no good reason for the omission. There also seem to be some hooks in the iPad SDK for a camera, indicating that if not present now, a camera will be included in a future update. Maybe it's a cost thing, or maybe the software wasn't up to scratch. I suspect, however, that the omission may be because the Apple A4 chip just might not have enough grunt for something like Skype video. I know that my MacBook Pro runs pretty hot when running Skype, so that may be a factor.
- Multitasking - Not an issue on the iPhone, this may become a real bugbear for the iPad. Time will tell. I suspect that at the heart of this is a philosophical debate about how we will use mobile devices in the future and whether we as humans need to multitask in the same way we do with a PC.
- No Flash Player - There seems to be a real spat between Apple and Adobe at the heart of this. Essentially, the lack of Flash hasn't detracted from the iPhone internet experience. Does anyone imagine that consumers will balk at the point of purchase because there's no Flash support?
Other observations, and what I find most curious about the iPad launch, are the following in no particular order:
- Lacklustre - For me it was one of the least effective Apple product launches of late. At times they were almost apologetic: "you have to hold the device in your hands to understand". Maybe Apple are correct that the real effect won't be understood till people are holding these things for real, but compared to the iPhone launch I thought this one lacked punch. However, the free marketing and hype machine has been in overdrive, and for many consumers the soundbite and image of an iPad in Steve's hands is enough to guarantee success.
- To consume or create? That is the question. The PC has always succeeded because it is a flexible creation device. The Walkman, Gameboy and iPod likewise worked because they are great consumption devices. To look at the iPad specs you'd probably think it's more a consumption device (a big iPod Touch) and that would be fine. However, Apple then throw in iWork at $10 a module but leave out iLife. This is interesting because iLife (iPhoto, iMovie, iDVD, GarageBand) has always been bundled free with Macs. It's obvious that Apple cannot bundle iLife into the iPad and keep the price point low, so why develop iWork for the iPad and sell it for a few bucks? My thoughts here are that Apple has big ambitions for the iPad in the workplace but won't aggressively market it as such. They will let private individuals champion the device, in the way the iPhone is championed in the workplace today. A year or two from now I can see iPads cropping up on desks next to PCs and being used for calendar, email and internet, together with the usual office suspects (word processing, spreadsheets and presentations). They won't instantly replace the desktop or Microsoft Office, but over the next decade I believe we will start to see the use of PCs diminish as they are replaced by tablet computers.
- Limitations? Remember the original iPhone? It was panned by the critics for being 2G. People were suspicious about touchscreen phones and the lack of a tactile keyboard. I suspect that we will look back at the first generation iPad in the same way. It's a placeholder for the main event if you like.
- MultiTouch - There was nothing really new here on the UI front, but again the beauty of the software keyboard and touchscreen means that any new developments can be introduced over time.
- The new Goldrush - What's interesting is that Apple used the term "goldrush" in their hype when describing the iPad AppStore. If I remember the North American gold rushes of the 1800s, it was the merchants and traders selling pickaxes that got rich, not the prospectors.
To sum up - Is the iPad all it could have been? Probably not. But has Apple done enough to establish a new type of device? Definitely.
Sunday, January 17, 2010
Open Source meets Closed Minds
Maybe I've learned a few things over the years. Then again maybe I haven't. In the previous post I've tried to highlight how I bought into the concept of Open Systems, especially the aspect of portability and how twenty years later we've somehow failed to fully exploit all the potential benefits.
The second Open concept that came along was that of Open Source. Basically, this is the idea that a bunch of sad geeks sitting in their bedrooms with nothing better to do with their time will develop software for free that is as good as, if not better than, commercially available software.
There are thousands of Open Source software products out there (Linux, MySQL, Apache, Firefox, Open Office are just a few) and the best part is that they're absolutely free. You'd have thought that this would have turned the IT world upside down - but no. Somehow we, and by that I mean corporate IT, have failed to embrace Open Source as perhaps we should have.
I'm sure there are lots of reasons for this but fundamentally I suspect that we just don't trust something we don't pay for. Here's a quote I heard the other day from the Enterprise Architect I'm currently working with. "I don't trust Linux or Apache. I'd much rather have something (in this case from Microsoft) that is properly supported".
The guy is delusional if he believes that because he pays licence fees and maintenance he will somehow get better service. I know - I've worked in a couple of software houses. This just shows that no matter how open the software is, it will fail when it meets closed minds.
Where did my Open Systems go?
Early in my career I bought into the concept of 'Open Systems' in a big way. I'd been working on proprietary mainframe systems for a couple of years and figured that 'Open' had to be the way to go.
For those uninitiated in the world of Open Systems, it was effectively a euphemism for UNIX and RDBMS technology. The 'Open' basically meant that systems could be developed on one platform (say Sun) and ported to another (e.g. HP or IBM) without much effort. Portability was the key to keeping your vendors sweet, and it also ensured that cross-skilling your staff was easier. The same was theoretically possible with SQL-based databases.
So did this happen? Not really. I used to work in a software porting team for a UNIX/RDBMS-based product, and because we applied strict standards to keep our code open we managed each port without too much effort. This was not typical though. Most IT guys out there will tell you that porting from one server or database to another is a major undertaking.
That has resulted in the vast majority of IT shops out there selecting a particular flavour of UNIX or RDBMS as their system of choice, and that kind of defeats the whole argument of Open Systems if you think about it.
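The 'strict standards' discipline still has a modern analogue. As a rough illustration (not the actual porting team's code, which was UNIX/RDBMS-era stuff), Python's DB-API 2.0 standard plays a similar role for databases that ANSI C and POSIX played for UNIX: code against the standard interface and plain SQL, and moving vendors mostly comes down to swapping the driver module. Here sqlite3 stands in for any vendor's database; in practice even this isn't perfect, since parameter placeholder styles differ between drivers.

```python
# Portability-by-standards sketch: all calls below are from the common
# DB-API 2.0 interface (connect, cursor, execute, fetchall), so the code
# doesn't care which vendor's driver sits behind the alias.
import sqlite3 as db  # swap this import for another DB-API driver to 'port'

def staff_at_site(conn, site):
    """Create a table, load sample rows, and query by site - standard SQL only."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE staff (name TEXT, site TEXT)")
    cur.executemany("INSERT INTO staff VALUES (?, ?)",
                    [("Ada", "Sun"), ("Ken", "HP"), ("Bri", "Sun")])
    cur.execute("SELECT name FROM staff WHERE site = ?", (site,))
    return [row[0] for row in cur.fetchall()]

conn = db.connect(":memory:")  # in-memory database, nothing touches disk
print(staff_at_site(conn, "Sun"))
```

Keeping to the lowest common denominator like this is exactly the kind of self-restraint most shops never apply, which is why their 'open' systems end up welded to one vendor.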