IBM to buy Telelogic: Rational, but not inspirational
I know that in the blogosphere, waiting a few days to comment on an announcement like
this one hardly puts me at the leading edge - but hey. Although I can't claim to be breaking any news, there are a couple of other points about IBM's purchase of Telelogic that I think are worth making.
As many other commentators have explained, in one respect the purchase of Telelogic takes Rational back to its roots as a tools provider for the development of embedded systems. In this respect, the purchase is really all about IBM capturing market share and consolidating the market for engineering tools for complex systems development. This analysis was given extra weight by Danny Sabbah, the General Manager of IBM's Rational business, when he stressed that investment in Telelogic's tools and capabilities would absolutely continue. If IBM keeps Sabbah's commitment, Telelogic customers can breathe a sigh of relief - and so can IBM. Embedded software developers love their tools, and many Telelogic customers will have made an explicit decision in the past not to go with Rational.
When you look at this (the biggest) part of Telelogic's business, however, it's clear that this isn't just the usual story of mature markets consolidating. Back in 1999 I spent more months than I care to remember on this project, which convinced me it was only a matter of time before ubiquitous broadband networking, consumer electronics and the digitisation of content opened up major new markets for tools vendors. The "pervasive computing and content" thing is starting to happen in earnest in a variety of sectors - including automotive, healthcare, consumer electronics, retail and travel. Where these come together, consumer-friendly (and that means high-performance, highly reliable, bulletproof) software is appearing in more places and in more guises. This is a new market opportunity, and Telelogic gives IBM the chance to grab more of it.
So far, so Rational.
But what's been more interesting of late, to me at least, is not Rational's heritage but its future direction. IBM has made it clear that Rational's focus is shifting from being a provider of development tools to being a provider of tools that help manage the process of software delivery - and helping customers turn IT inside-out. A big gap here has been in the provision of tools that really help customers model above the level of individual systems, and the surprise to me has been that although Telelogic has these (obtained when it bought Popkin back in 2005), IBM's early talk hasn't put much emphasis on their value. To me this is a major missed opportunity, as it's a capability that more and more "mainstream" businesses with IT organisations are starting to realise they need. Enterprise Architecture competency is quite thin on the ground, and IBM has a chance to take a significant step forward in guiding customers here.
Assuming IBM does start to think and talk a bit more about Telelogic's enterprise modelling tools (which would seem to make sense, you'd think), my take is that this is one area where the technology would be best served within Rational's own product management structure rather than under Telelogic. As Rational moves its focus more towards managing the process of software development, the Telelogic assets naturally form a specialised sub-piece within the overall picture - but System Architect fits neatly alongside things like RAM, RMC, RPM, and some other stuff I can't talk about yet.
So - I really hope we start to see more from IBM about how the more mainstream capabilities of Telelogic will be taken forward, and if/how it will start to separate those mainstream capabilities from the specialist "complex systems engineering" capabilities.
Labels: acquisition, architecture, EA, ibm, inside-out IT, Telelogic
Shock, horror: Microsoft and Concordia
Labels: identity, interoperability, Liberty, Microsoft, OpenID
Microsoft's Dynamic IT: it's a start
I have just returned from a couple of days in Orlando, where I attended a Microsoft Server and Tools Business analyst summit which coincided with the company's TechEd conference. The RedMonkers James and Coté did a great job of live blogging the event (here, here, here, here, here and here) - and there was even some Twittering - but I needed the joys of a nine-hour transatlantic flight to collect my thoughts.
The big news at TechEd and the focus of the analyst summit was Microsoft's Dynamic IT for the People-Ready Business (Dynamic IT) strategy, which the company describes as building "on the company's Dynamic Systems Initiative and ongoing Application Platform efforts to provide customers with the key areas of technical innovation necessary to make their IT and development organizations more strategic to the business". In other words, it's a framework which builds on a number of Microsoft's most significant, but historically largely disconnected, initiatives and is designed to help customers understand how they can be combined to increase the business value of IT. This is long overdue, for a couple of reasons.
First, whilst Microsoft has used language in the past which implies linkage between the different initiatives and associated products, such as 'design for operations' for DSI and .NET, it's not always been clear how the implication becomes reality. For example, how do the System Center management tools exploit operational policy requirements defined in Visual Studio and how do those requirements map to policies defined in Windows Communication Foundation? Dynamic IT sets out to make the linkage explicit.
Second, Microsoft has lacked a cross-company vision for enterprise IT (for want of a better term) within which to frame discussions with customers and around which it can rally the troops. I'm thinking here of things like IBM's On Demand, HP's Business Technology, Oracle's Fusion etc. There's People-Ready of course but I think that's about more than Enterprise IT. Dynamic IT provides Microsoft with a competitive alternative and one that is more reflective of current reality than future aspiration.
There are four aspects to Dynamic IT where Microsoft plans to focus innovation:
- unified and virtualized
- process-led, model-driven
- service-enabled
- user-focused
all built on a federated, interoperable and secure foundation. Obviously it's still very early days, but I do think Microsoft has a lot of work to do if it's going to achieve what I believe it hopes to with Dynamic IT.
For example, in his keynote, when Bob Muglia talked about process-led, model-driven, he discussed process-led in terms of the application lifecycle, BizTalk, Windows Workflow Foundation and Office Business Applications, and model-driven in terms of System Center and IT management models (based on the Service Modelling Language and the Common Model Library). What he didn't do was explain the relationship between the two. When describing service-enabled, he focussed on .NET, SOA, web services and software plus services, primarily from the bottom-up, developer perspective (consistent with
Microsoft's initial foray into SOA) but failed to tie that into the end-to-end service lifecycle -
Big SOA - and thus into process-led, model-driven. (As an aside, I think Microsoft is also missing a trick when it comes to information and data as a service, but that's for another day.)
As well as explaining the relationships between the different aspects of Dynamic IT, Microsoft also has to be very careful that it doesn't fall back into the trap of using it simply as a framework for categorising its products. Increasingly, the key concerns of the people it is trying to reach with Dynamic IT don't fall into neat product categories and Microsoft has struggled in the past to articulate the joined-up propositions required to address these concerns because of its focus on product stovepipes (as I discussed
here).
What I think Microsoft needs, as I explained during various meetings at the summit, are scenarios and associated case studies that bridge between the framework and the products and emphasise the linkage. This will also serve to highlight the importance of the three foundational aspects - federated, interoperable and secure - which might otherwise be lost, and to tie into the Core, Application Platform and Business Productivity Infrastructure Optimization roadmaps that Microsoft is using to help customers understand how to move forward from where they are today.
For Microsoft's customers and potential customers, Dynamic IT is a positive sign that the company is beginning to recognise that you are more concerned with the outcomes of deploying its technologies than with the technologies themselves, or with the way Microsoft chooses to structure itself to develop and sell them. Over the coming months you should be looking to Microsoft to fill out the framework, and seeking explanations of how the pieces fit together today and how the company plans to enhance that integration going forward.
Labels: HP, ibm, Microsoft, Oracle, SOA
Turning IT inside out and the trouble with ITSM and BSM
The other day Martin Atherton over at our partner Freeform Dynamics
got me thinking again about the IT service management (ITSM) / application management / business service management (BSM) hoopla we've long been saddled with in the IT industry.
I can absolutely see why vendors would want to try and avoid being seen as "just" providers of ITSM tools and make themselves look more "businessy". It's another example of the stack race phenomenon you see in so many areas - development tools, middleware, etc - and the simple idea is that if you can make your offering look and sound as if it can help customers talk more effectively to businesspeople, it's better than an offering that is a bit oily under the fingernails.
And I absolutely believe there's a place for tools that can help customers explain the value of IT investments in a way that makes sense to the people who pay the bills.
The problem is that the vast majority of the technology and practice out there does nothing of the sort - at least not without the expenditure of a lot of blood, sweat and tears. To characterise the ITSM/app management/BSM "stack" crassly (and probably unfairly), all that happens as you move higher up the stack is that events and alerts are correlated at ever more abstract levels. Events from routers, servers and switches are aggregated to give higher-level views of the health and performance of infrastructure; infrastructure events are correlated with stats from DBMS instances, application servers, web servers and more to give higher-level views of the health and performance of "applications"; and information at the application level can sometimes be aggregated further still.
But fundamentally all we're doing is reporting on chunkier technology outcomes - and they are still technology outcomes. The insight is about performance, uptime, security and so on. There is no business context.
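To make the point concrete, here is a minimal, purely illustrative sketch of the kind of roll-up this stack performs. Everything in it - the event fields, the component names, the "order-entry" application and the mapping between them - is hypothetical, invented for illustration rather than taken from any particular product; it simply shows component-level events being correlated into an "application health" view.

```python
from collections import defaultdict

# Hypothetical low-level events, as an infrastructure monitoring tool might emit them.
raw_events = [
    {"source": "router-01",    "layer": "network",  "severity": "warning"},
    {"source": "db-server-2",  "layer": "database", "severity": "critical"},
    {"source": "app-server-1", "layer": "app",      "severity": "ok"},
]

# Hypothetical mapping from infrastructure components to a named "application".
component_to_app = {
    "router-01": "order-entry",
    "db-server-2": "order-entry",
    "app-server-1": "order-entry",
}

SEVERITY_RANK = {"ok": 0, "warning": 1, "critical": 2}

def rollup(events):
    """Correlate component-level events into a per-application health view.
    The 'insight' produced is still purely technical: worst severity wins."""
    app_health = defaultdict(int)
    for event in events:
        app = component_to_app.get(event["source"], "unmapped")
        app_health[app] = max(app_health[app], SEVERITY_RANK[event["severity"]])
    rank_to_label = {rank: label for label, rank in SEVERITY_RANK.items()}
    return {app: rank_to_label[rank] for app, rank in app_health.items()}

print(rollup(raw_events))  # -> {'order-entry': 'critical'}
```

Even at the top of that roll-up, the answer is still a technology statement - "order-entry is critical" - and nothing in the chain says what a broken order-entry application costs the business per hour, or which business process or customer commitment is actually at risk.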
I could argue that we do have technology that can help provide business context to ITSM, app management and BSM - business activity monitoring (BAM). But to focus first on technology is missing the point.
The real underlying point is that to really manage services that make sense in a business context, the whole mindset of the IT organisation has to be turned inside out. IT organisations have to stop focusing so much on internal perspectives of process improvement and efficiency (are we doing things right?), and start focusing a bit more on an external perspective (are we doing the right things?).
To pursue this idea of "inside-out IT" into the software development realm, let's not forget - as I said to an audience of CIOs last night - that you can be at CMMI level x and still not guarantee that the things you do will drive business value; instead you *can* turn out irrelevant systems, but in a very predictable way.
Labels: BSM, inside-out IT, ITSM, methods, tools