"Open" will find a way
By way of
Don Box, I just came across
this story-ette.
evilapi.com - what a great domain!
For those who haven't been following the story of
Google's removal of its search engine's SOAP API, you can catch up on the
news so far at Slashdot.
As Don says, don't you just love the web? Open standards and universal markets mean that if there's demand for something you have, the market will find a way to help fulfil it - whether you as a supplier are ready or not.
I know plenty of other people have provided good insight into the challenges and opportunities around mashups, but to me this is a stark example of the issues faced by every enterprise offering services or resources on the web.
With mashup techniques becoming more and more widely understood, from a technical perspective at least your online assets are pretty much public property. If you're not prepared to offer facilities to make things easy to consume, then it could be that someone else will provide those facilities.
And you might not even know it.
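Don's point about the market finding a way can be made concrete with a sketch of the screen-scraping approach such "unofficial APIs" rely on: rather than calling a sanctioned SOAP endpoint, you fetch the public HTML results page and parse the data back out. This is a minimal illustration in Python using only the standard library; the HTML snippet and the "result" class name are hypothetical stand-ins, not any real site's markup.

```python
# A minimal sketch of screen-scraping a search results page: extract
# (href, anchor text) pairs from links inside hypothetical "result" divs.
from html.parser import HTMLParser

class ResultLinkParser(HTMLParser):
    """Collects (href, anchor text) pairs for links inside result divs."""
    def __init__(self):
        super().__init__()
        self.in_result = 0       # depth counter for nested "result" divs
        self.current_href = None # href of the <a> we are currently inside
        self.results = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "div" and attrs.get("class") == "result":
            self.in_result += 1
        elif tag == "a" and self.in_result:
            self.current_href = attrs.get("href")

    def handle_endtag(self, tag):
        if tag == "div" and self.in_result:
            self.in_result -= 1

    def handle_data(self, data):
        # Record the anchor text for the link we are inside, once.
        if self.current_href:
            self.results.append((self.current_href, data.strip()))
            self.current_href = None

# Stand-in for a fetched results page; a real scraper would download
# this with an HTTP request.
page = """
<div class="result"><a href="http://example.com/a">First hit</a></div>
<div class="result"><a href="http://example.com/b">Second hit</a></div>
"""

parser = ResultLinkParser()
parser.feed(page)
print(parser.results)
```

The fragility is the point: the scraper depends entirely on the publisher's markup, which can change without notice - exactly the "whether you as a supplier are ready or not" dynamic described above.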
Merry Christmas and Happy New Year! I'll be back in 2007.
Labels: google scraping, mashups
Even standards organisations aren't immune to twodotoism
The Liberty Alliance, which does great work in the world of federated identity technology standards, policies, guidelines etc, has
succumbed to the 2.0 bug. On the 22nd January it will be holding the "Liberty 2.0" workshop but don't let that put you off. The excellent line-up of speakers (and I am talking from experience) will be covering the Identity Web Services Framework (ID-WSF) which, as I discussed
here, addresses user-centric as opposed to enterprise-centric federation scenarios.
ID-WSF is not the only user-centric identity initiative in town, though, so I hope the press release lives up to its promise and will feature experts in OpenID, which is rapidly becoming a significant force. Without interoperability, user-centric identity is a non-starter. In that regard, it's encouraging to note that the Higgins project (see
here,
here and
here) has a slot on the agenda.
If I weren't in the UK, I think it would be worth a day of my time.
Another SOA podcast appearance
Jon is pretty busy at the moment finishing off
the book (and I have a couple of minutes before some more editing) so I thought I would highlight his latest podcast over at BriefingsDirect, where he and a number of other independent analysts reviewed SOA in 2006 and made some predictions for 2007. You can download the MP3
here or read the full transcript
here.
A useful primer on SOA governance
I just came across this whitepaper from webMethods (which is not a client)
SOA Governance: Enabling Sustainable Success with SOA. Putting to one side the fact that this is from a vendor and the checklist in the Appendix is clearly oriented towards webMethods offerings - based on
the acquisition of Infravio earlier in the year - I have to say I am pretty impressed.
Too much of the discussion of SOA governance focuses on design-time concerns: adherence to standards such as the WS-I profiles, schema validation and so on. It ignores the fact that IT services, like services in the real world (think of your mobile/cell phone service), are experienced by the customer, and that experience is about more than just what is built by the provider. Because services are experienced, SOA governance must extend to encompass the complete service lifecycle, from development through to operations: something which is acknowledged in the webMethods paper.
That being said, I do have a couple of quibbles:
- Business involvement is called out during service change but not in the definition of the quality of service agreements, which comes across as the domain of the IT organisation. Business involvement is essential here to capture expectations and ensure that metrics are presented in a business-meaningful way
- Service contracts are highlighted in the discussion of SLAs - "how well" a service is performed - but not in terms of the functionality provided by the service - the "what" - or the commercial aspects of service provision - the "how much". If an SOA approach is really to deliver business value then business and IT must be able to establish some common ground in terms of service expectations, and it is comprehensive service contracts, encompassing all of those aspects, that provide it.
Bearing in mind those two important considerations, the paper is worth a few minutes (as are our reports on
SOA and
SOA quality management which provide our perspective on some of these issues).
Labels: governance, SOA
Cutting out the middleman
According to
this story at News.com BEA will be announcing plans to release WebLogic Server Virtual Edition in the first quarter of 2007. According to the story, this is a version of the company's WebLogic Java application server which does not need an operating system. Instead, it runs on a version of BEA's JRockit Java virtual machine modified to run directly on VMware's virtualisation hypervisor.
I was at VMware's
VMworld 2006 conference at the beginning of November, and VMware discussed this collaboration with BEA as part of its broader
virtual appliance initiative (see
this post from Stephen O'Grady over at RedMonk for some detailed analysis of virtual appliances), so this news doesn't come as a total surprise.
The story quotes Guy Churchward, VP of WebLogic products at BEA:
Our goal was to double the utilization by running natively and to double the performance.
Removing the operating system and exploiting the consolidation benefits of virtualisation is certainly consistent with Guy's utilisation claims; I am not so sure about doubling the performance, though, given the overheads associated with a virtualisation layer (although perhaps BEA partner
Azul Systems might be able to help out with its
Java Compute Appliances).
Putting those promised benefits to one side, however, this announcement does raise a number of other questions in my mind. What implications does this have for BEA's licensing, which has been based on the traditional per-CPU (and per-core) model? How is BEA going to deal with performance optimisation, given that all of the expertise developed by the company and its customers assumes an intervening operating system? The same applies to product support, patching and management. According to the News.com story BEA will be providing a management console, Liquid Operations Control, later next year, so it will be interesting to see the extent to which it addresses some of these challenges. Unless it does, customers will struggle to realise the claimed benefits. Customers will also have to rethink deployment architectures, hopefully with some guidance from BEA, since the application server is only one component of any solution and in most scenarios those other components are not going to be virtual appliances (at least in the short term).
BEA clearly hopes to exploit the significant amount of interest in virtualisation to generate interest in WebLogic Server, and should also benefit from virtual appliance evangelism from market leader VMware. However, VMware is not the only virtualisation offering out there: there's Microsoft with Virtual Server and the Longhorn-era hypervisor; XenSource with its paravirtualisation; and Sun with Solaris Containers, among a range of other offerings. According to the article, BEA does plan to support other virtualisation solutions. Aside from the significant engineering effort that entails, I am not so sure that the likes of Microsoft and Sun are going to be too happy with a proposition based on cutting out the operating system.
This is definitely something to watch and it is going to be interesting to see how the likes of IBM, Oracle and SAP respond.
Maintenance, innovation and half-baked pies
I'm getting just a little bit bored of a certain slide that seems to appear in every single IT vendor's enterprise pitch at the moment. It's the one with the pie chart - where about 70% of the pie is allocated to "Maintenance" and about 30% relates to "Innovation". The theory is that CIOs are looking to reduce the amount they spend on the former, so they can free up resource for the latter.
On the surface, this appears all well and good. Scratch a little beneath the glaze however, and things become far less simple:
- Few companies have a clear idea of the size of their own pie. In the discussions we have had around
the book, IT executives have been telling us how difficult it can be to get a clear picture of spend, for new technology projects or for maintenance of older systems. This is true particularly if IT responsibility is devolved to the lines of business.
- In similar discussions, we are told that projects are increasingly being driven by quite stringent business cases. While the budget totals may add up to the 30%, this is because the line has to be drawn somewhere, rather than reflecting any "here's a piece of pie" view.
- To extend on this, organisations that see themselves as technology-driven are looking to the business benefit of any technology as well as looking at its intrinsic cost - more of a whole-meal view.
- Perhaps for these reasons, the pie itself is shrinking. As discussed by
Nicholas Carr a few weeks ago, IT budgets are falling, and the maintenance side is coming down faster than the innovation side.
- Finally, what does an innovation become, the day after deployment? Why, maintenance of course. There are plenty of new projects going on that are in fact replacing older systems with updated versions - as illustrated by
Dale Vile's recent SAP post.
The pie analogy as it stands is not completely wrong, but it is over-used and simplistic. Where it may be true is that the CFO says to the CIO, "We're not going to give you any new money for project X, you're going to have to fund it yourself." In which case of course, it is inevitable that money needs to come out of one part of the budget, to shore up the other.
However, the suggestion that one side of the pie is shrinking and the other is growing, is a leap too far. It is also a dangerous starting point, suggests my colleague Neil Macehiter: "The key point is that innovation without a stable foundation where you understand your key assets, their costs and the value they add to the business, will mean that the only thing you innovate is chaos. If you simply shift budget, without stabilising and consolidating the foundation, you're heading for trouble."
So, what's the alternative? Rather than drawing the line pre-and post-deployment, a better place to start is to distinguish between IT investments that relate to non-differentiating parts of the business, and IT investments that help the organisation differentiate itself from the competition. Of course organisations will still have to work out what IT they have, and where it adds value; but if the goal is to rob Peter to pay Paul, it is a far better approach to drive costs out of the non-differentiating parts of IT so that the differentiating parts can be funded, extended and improved upon, whether they be in maintenance or otherwise.
We might still end up with a pie, but at least it would be fully baked.
Avoiding technobabble
One of the principles we are advocating in our
book is the need to create a common language between business and IT to ensure that there is a common understanding of what the business is trying to achieve, relative priorities, the role of IT, measurement of the business value of IT and so forth. A critical aspect of that common language is the avoidance of technobabble and a focus on the what not the how.
This
post from Andrew McAfee regarding "Enterprise 2.0" highlights why this is so critical. In it, Andrew points to a number of definitions provided by contributors at
Sandhill.com, including
M.R. Rangaswami and
Philip Lay:
Enterprise 2.0 is the synergy of a new set of technologies, development models and delivery methods that are used to develop business software and deliver it to users
[T]he three main pillars of Enterprise 2.0 [are] open source programming languages such as the interactive web application development tool Ajax (Asynchronous JavaScript XML), the increasing number of SaaS offerings, and the highly anticipated appearance of hundreds of SOA application modules in the form of re-usable web services.
and highlights the likely response from non-IT business leaders:
I felt that we'd lose the attention of non-IT business leaders as soon as we defined E2.0 as anything except the process of using the new social tools within companies. There's a large role for business leaders in this process, and I've found that they're willing, even eager, to discuss what that role is.
They're a lot less eager to hear about software as a service, open source development methods, offshoring. Their eyes glaze over when these topics come up, and I can almost hear them think "We have an IT department so I don't have to think about these things."
Absolutely!
One can imagine a similar glazing over of the eyes as IT organisations try to sell the technology capabilities of SOA, virtualisation, federated identity, composite applications etc to the business instead of focussing on what those capabilities enable.
Andrew's conclusion says it all:
IT advocates too often make claims about technologies that are disproportionate to, or even divorced from, what the technologies actually do. Is it any wonder, then, that the IT-business dialog is so tenuous and marked by skepticism, and that IT so rarely has a real seat at the table during strategic discussions? How long should we expect businesspeople to believe technologists' assertions that a brave new world is at hand, and that it can be inhabited after one more set of trends comes to fruition?