Thursday, December 22, 2005

An unseasonal gift for Microsoft from the EC

As Microsoft heads into the seasonal break it does so with the threat of daily fines of 2 million Euros, rather than sprigs of mistletoe, hanging over its head, according to Reuters. The European Commission is clearly in no mood to kiss and make up.

The EC has issued a Statement of Objections to Microsoft for failing to comply with one of the remedies imposed following the EC's decision that Microsoft had abused its near OS monopoly. At issue is the EC's view that Microsoft has failed to provide comprehensive and accurate documentation to allow third parties to interoperate with Windows - a view which is backed up by two reports from Professor Neil Barrett, a UK computer scientist who was appointed as an independent monitor in October.

EC competition commissioner Neelie Kroes didn't mince her words:



I have given Microsoft every opportunity to comply with its obligations. However, I have been left with no alternative other than to proceed via the formal route to ensure Microsoft’s compliance.


The formal route in question is a potential daily fine of 2 million Euros. Clearly, Professor Barrett took a dim view of the quality and usability of the Microsoft documentation.

The clock is now ticking and Microsoft has 5 weeks to respond before the Commission can begin the process of imposing the fines, retrospectively from 15 December until Microsoft complies. Microsoft's lawyers and technical writers may not get the holiday break they were hoping for.

For a company with Microsoft's resources and technical nous, I find it somewhat difficult to believe that it has been unable to document a set of interfaces, particularly since it has had ample time and feedback from interested parties. I can only assume that it is a delaying tactic. 5 weeks may not seem like much of a delay but the EC is going to have to go through additional administrative steps to impose the fine and then again to extend it. And even if a fine is imposed 2 million Euros/day is hardly going to get the Microsoft bean counters scrutinising the cash flow.

Of course, this is only part (and a much less complex part) of the story. The other remedy imposed by the EC requires that Microsoft make the interoperability information available on reasonable terms. With the open source community, particularly the Free Software Foundation, lobbying hard, the EC and Professor Barrett have their work cut out simply determining what constitutes "reasonable" and for whom, let alone whether Microsoft satisfies that determination. It is therefore unsurprising that this is still being evaluated.

Given the delays around the documentation and the complexity of the licensing terms, it seems highly likely that the second anniversary of the EC's original decision in March next year is going to pass without a satisfactory resolution.

UPDATE
Microsoft has responded to the EC's objections claiming they are unjustified. Microsoft General Counsel Brad Smith is quoted as saying:

We have now responded to more than 100 requests from the Commission. We continue working quickly to meet the Commission's new and changing demands. Yet every time we make a change, we find that the Commission moves the goal post and demands another change.

Now I am not a lawyer but it seems to me that the EC has requested comprehensive and understandable documentation and an independent expert (one of a list provided by Microsoft) has said they have not provided it. That doesn't seem to me to be a shift in the goal posts: the goal posts are in the same place but Microsoft has failed to shoot between them.

Smith then goes on to claim that enabling interoperability:

can open the door to the production of clones of parts of the Windows ... The Commission confuses disclosure of the source code with disclosure of the internals and insists that it will fine the company if it fails to address this


I have not had sight of the documentation and the extent of the interfaces, but it seems to me that Smith is overstating the case somewhat. Facilitating interoperability, as Microsoft knows full well given its work on web services (e.g. with the use of WS-Management to manage Windows Server 2003 R2), COM etc, does not necessarily require that the internals are exposed to a sufficient level to enable cloning.

Smith says that Microsoft will contest the statement to the full extent permitted under EU law, which includes a full Oral Hearing. Such a hearing will no doubt take months to arrange, adding further delay to the process (any fines will not be enforced until Microsoft's objections are considered).

If the NY Times article is to be believed, I also think Smith has done himself and Microsoft no favours by claiming that neither the EC nor Professor Barrett had reviewed the documentation properly.

Wednesday, December 21, 2005

IBM continues its pre-Christmas shopping spree

Unlike the rest of us, IBM doesn't seem to want to wait for the January sales! Having bolstered its composite application development capabilities with yesterday's acquisition of Bowstreet, it followed up today with the proposed (subject to shareholder approval) far heftier $865M acquisition of San Francisco-based and (excuse the parochialism) UK-originated Micromuse.

Micromuse plugs an obvious gap in IBM's IT service management portfolio, bringing much-needed network discovery, monitoring and fault management capabilities - based on Netcool/OMNIbus, Proviso and Precision - as well as security event management - based on Micromuse's July acquisition of GuardedNet - and application-level impact analysis - based on Netcool/Impact.

With enterprises and service providers alike grappling with the management challenges of IP-based network convergence and new services, such as VoIP and IMS, IBM needed a credible response. This acquisition certainly seems to do that, particularly as there is already integration between Netcool and IBM's Tivoli Enterprise Console, NetView and Change and Configuration Management Database (via the November acquisition of Collation and its Confignia product).

Since the acquisition is still subject to approval, IBM was unable to go into too much detail about the integration plans. Once it is approved, I will be looking to see whether and how IBM plans to use Netcool to address storage network management as a response to EMC's plans with SMARTS (which coincidentally was acquired a year ago today). I, and I am sure many of Micromuse's existing customers, will also be hoping for some clear statements about the future of a number of Micromuse's existing partnerships, which take on a new complexion with IBM now in control - specifically those with BMC, CA/Concord, HP/Peregrine and Tibco from a technology perspective and Accenture from a services perspective.

And there's still two days to go before the Christmas weekend!
Tuesday, December 20, 2005

IBM snags Bowstreet for composite application development

IBM today announced the acquisition of long-time partner Bowstreet, a 75-person, privately-held Massachusetts company based just down the road from IBM's Workplace, Portal and Collaboration Software group, into which it will be assimilated (or "bluewashed" as IBM quaintly refers to it). The acquisition is hardly surprising given that the two companies have been working together for more than 4 years and share more than 100 customers.

Bowstreet began life in the heady, pioneering dot-com days as a web services specialist but saw the writing on the wall with the crash and quickly refocussed its efforts on portals, and more specifically portal application development frameworks - its Portlet Factory - and a number of application solutions developed using the Portlet Factory for corporate performance and workforce management. Since then the company has worked very closely (in fact, almost exclusively) with IBM and now offers the WebSphere Portal Adapter, which provides tight integration with IBM's WebSphere Portal and Rational Software Architect and which, as Michael George, CEO of Bowstreet, so aptly put it:


"Brings Microsoft ease to the power of Java" (SCA anyone?)

The two companies clearly know each other well and the technology is well integrated, so IBM believes that the washing process should be rapid, with Bowstreet taking on a blue tint in short order. The roadmap discussed in the announcement sets out spring next year as the timeframe for re-branding and integration, with Bowstreet Portlet Factory becoming IBM WebSphere Portal Portlet Factory and the Bowstreet Corporate Performance Suite becoming part of IBM's Workplace for Business Strategy Execution.

Bowstreet has also worked with Oracle to provide a Portlet Factory optimised for the Oracle Portal (part of the Oracle Fusion Middleware "suite"). IBM and Oracle are already co-operating around WebSphere and Fusion (that's the Oracle Applications Fusion) and plan to continue co-operating on the Portlet Factory, although it seems pretty clear where resources are going to be focussed over the next few months - and it's not in the direction of Redwood Shores.

IBM positions the combination of WebSphere Portal and Portlet Factory as a solution for composite application development. Note the use of "a" and not "the". Here lies the single biggest challenge for IBM going forward. It offers multiple solutions for composite applications, including the Portlet Factory, Workplace Designer and WebSphere Integration Developer for development, and WebSphere Portal, WebSphere Process Server, WebSphere ESB and Workplace for deployment. Customers are likely to be confused by the array of alternatives and IBM is going to have to work hard to explain which approach makes sense when. It is not sufficient to simply state that Workplace, WebSphere Portal and Portlet Factory are for "integration at the glass" and WebSphere Integration Developer, Process Server and ESB are for "integration at the backend". Customers need to know when it makes sense to do the former and how any investment they make can be exploited in the latter approach.

IBM will face similar challenges when it comes to asset integration. Bowstreet also provides a set of Integration Extensions for SAP, PeopleSoft, JD Edwards, Siebel, Documentum and a number of other applications and technologies. Customers need to know when they should use these rather than the integration capabilities provided with WebSphere.

So, whilst the acquisition certainly makes a lot of sense for IBM, the company is going to have to invest in the advice and guidance to ensure it makes sense for customers too, in the context of the broader IBM SOA/composite application proposition.
Monday, December 19, 2005

Business processes and practices

Ross Mayfield's perspectives on the demise of business process and the rise of social software, which the other Neil has previously commented on, continue to spark debate. This time, fellow analyst Mike Gotta over at Burton Group enters the fray, stating:
the topic would have been better served by framing it as a process vs. work practice debate.
Mike's premise is that
the greater the elasticity of the process, the greater the variability in work practices and the greater the value social software can deliver to make that process perform more efficiently and effectively (the assumption being that with more variables "in play", social software can provide greater collective intelligence in navigating through multi-faceted, cause-effect decisions).

The notion of business process versus practices echoes some of the thoughts of Ray Ozzie in this excellent Q&A over at ACMQueue:
In most major enterprises, there are formalized business processes that people understand. You have some companies that are very, very into the structured process aspects of their business and the process optimization, whether it’s Wal-Mart with the logistics or Dell, which is very focused on business processes.

On the other side of the spectrum are what I would refer to as the business practices – the more unstructured things that people do at the edge of the organization, and that’s really project-centric work. It’s really important for companies to understand how they focus and optimize their business processes and how they support their people in terms of their day-to-day practices.


Whilst I certainly agree that day-to-day business operations depend on a mixture of highly structured activities and ad-hoc, unstructured, collaborative activities, I struggle with this distinction between business processes and practices. The problem is that when we in IT talk about business processes, what we are actually referring to, as my business partner put it in his response to an earlier post by Ross, is the shadow they cast on IT:
Business processes are rich, collaborative, often unpredictable and organic. It's just that the shadow that they cast onto IT - the systems that we have built to automate parts of business processes - is highly structured and often rigid. It's dangerous to look at the shadow that business processes cast onto IT systems today, and assume that this is what business processes really look like.

Instead of this artificial distinction between business processes and practices, which I think many enterprises will struggle with, I believe it is far better to focus on business processes and recognise that they are far more sophisticated than their comparatively simple IT shadow. We discuss this in depth in our report on business process management but I shall attempt to summarise it here.

To really understand business processes and how they can be optimally supported by IT assets and services it is essential to recognise that they vary in terms of their organisational context: how they serve to differentiate the business and their "level" within the organisation - whether they support "execution", "management" or "strategy" activities. They also vary in terms of the implementation environment: the extent to which the roles within the processes receive automated support rather than being carried out by humans - role automation - and the degree to which the interactions and collaborations between roles are automated - process control automation. Straight-through trade processing in financial services, for example, has high levels of both role and process control automation, whilst the vast majority of strategy-setting processes in the majority of businesses receive very limited support through IT automation and are largely ad-hoc in terms of interaction and collaboration.
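These two implementation-environment dimensions, role automation and process control automation, can be caricatured in a few lines of code. The sketch below is purely illustrative: the class and the crude LOW/HIGH levels are my own shorthand, not a model from our report:

```java
// Illustrative sketch of the two implementation-environment dimensions
// discussed above. Names and levels are shorthand of my own, not a
// formal model from the MWD report.
public class ProcessProfile {
    public enum Level { LOW, HIGH }

    public final String name;
    public final Level roleAutomation;           // are roles performed by systems or humans?
    public final Level processControlAutomation; // are hand-offs orchestrated or ad hoc?

    public ProcessProfile(String name, Level role, Level control) {
        this.name = name;
        this.roleAutomation = role;
        this.processControlAutomation = control;
    }

    // "Straight-through" processing requires high automation on both dimensions
    public boolean isStraightThrough() {
        return roleAutomation == Level.HIGH && processControlAutomation == Level.HIGH;
    }

    // What Mike and Ray call a "practice": low automation on both dimensions,
    // but still a business process
    public boolean isPractice() {
        return roleAutomation == Level.LOW && processControlAutomation == Level.LOW;
    }
}
```

The point of the sketch is simply that straight-through trade processing and ad-hoc strategy-setting sit at opposite corners of the same two-dimensional space, rather than being different kinds of thing.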

When Mike and Ray talk about business processes and practices, they are really highlighting differences in the implementation environment - business practices have low levels of role and process control automation but are business processes nonetheless.
Tuesday, December 13, 2005

More MWD live!

After last week's appearance by the other Neil, it is both of our turns this week! Fellow independent analyst and ZDNet contributor, Dana Gardner, over at Interarbor Solutions, hosted a 30-minute podcast with us both. We provide our thoughts on the recent launch of SCA, as well as a bit of 2005 retrospective and a look forward to 2006.

If you have 30 minutes to spare (I know, I know!) then you can download the MP3 here or, if you're an iTunes user, search for "BriefingsDirect" in the podcast directory.

Expect more in 2006.
Monday, December 12, 2005

Why is there no WS-Contract?

In an earlier post I alluded to recently meeting Steve Jones, Capgemini's CTO of Application Development Transformation. We shared a panel for a VNU web seminar on SOA and ESB, and it turns out that we have pretty similar views on SOA – specifically, that the really crucial bit is about understanding the "S" and the "A".

In other words – Q: What makes SOA different from "just a bunch of web services"? A: An architectural approach. Which puts services at the centre.

Which probably doesn't mean an awful lot by itself ;-)

The crucial bit about this, though, is that services aren't interesting because they can be exposed using WSDL and interacted with using SOAP (or indeed using any other combination of readily-available interface definition language and communication protocol): they're interesting because of what they can represent. Just as OO was valuable primarily because objects could be representations of real-world objects, "SO" is valuable primarily because services can be representations of what companies do. [Note: here, I’m pretty much quoting direct from Steve's presentation.] To me, this means that SOA's potential value is really about helping IT organisations, and the businesses they serve, start to talk and collaborate using a common language.

The challenge here is that companies wanting to balance a bottom-up approach to SOA (which is about how to make software systems more interoperable through service interfaces) with a top-down approach (which seeks to identify and model how service portfolios can be managed to help align IT and business) have precious little available to them in terms of off-the-shelf tools. Some of the modelling tool vendors (IBM Rational is one example) and "enterprise architecture tool" players (Troux Technologies springs to mind) can validly say that they can help – but the truth is that most of what's going on is about modelling software structurally, even if this is done at a high level. It's not focusing with equal power on quality-of-service or commercial commitments which need to be thought about if SOA is really going to be about IT delivering "services" to the business.

A rich approach to the idea of "service contracts" is an excellent way to balance structural (functional) concerns with QoS and commercial concerns – and build bridges from requirements modelling, all the way through design, development, and deployment to system operation. But at the moment the infrastructure required to support this contract-centric approach is missing.
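To make the contract idea a little more concrete, here is a minimal sketch of what a contract-centric service description might carry alongside its functional interface. Everything here is hypothetical - the class, operation name and terms are mine - and no real WS-* specification or product API is implied:

```java
// Hypothetical sketch: a service "contract" carrying quality-of-service
// and commercial commitments alongside the functional concern (the
// operation name). Nothing here corresponds to a real WS-* specification.
public class ServiceContract {
    public final String operation;     // functional concern: what the service does
    public final int maxLatencyMillis; // QoS concern: response-time commitment
    public final double availability;  // QoS concern: e.g. 0.999 = "three nines"
    public final double pricePerCall;  // commercial concern: charge-back terms

    public ServiceContract(String operation, int maxLatencyMillis,
                           double availability, double pricePerCall) {
        this.operation = operation;
        this.maxLatencyMillis = maxLatencyMillis;
        this.availability = availability;
        this.pricePerCall = pricePerCall;
    }

    // Would an observed response time honour the contract?
    public boolean meetsLatency(int observedMillis) {
        return observedMillis <= maxLatencyMillis;
    }

    // Monthly charge for a given call volume under these commercial terms
    public double monthlyCharge(long calls) {
        return calls * pricePerCall;
    }
}
```

The value of making these commitments first-class, rather than burying them in a WSDL annotation or a Word document, is that the same artefact can be checked at design time, monitored at runtime and reported against commercially.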

With this in mind, I think it's about time that the likes of IBM, Microsoft, Sun, BEA, Oracle, SAP, Borland, Compuware, CA, and Telelogic put their heads together and started thinking not only about the "bottom up" view of SOA – but also about how to promote standard approaches to help people with the "top down" view of SOA that really drives value. They could start by talking to Bertrand Meyer.

As Steve put it: "Why isn't there a WS-Contract initiative?"
Friday, December 09, 2005

MWD live!

For anyone who's interested in SOA, this might be worth a look.

I present first, and most of what I'm talking about is the four steps involved in maximising the business value of SOA. I then talk a bit about the technology required to make these steps - which is also covered in our recent SOA quality management report.

It was great fun to do the event, and I also got to meet Steve Jones, another very smart guy from Capgemini. Where do they get them?
Wednesday, December 07, 2005

With SCA, reality bites J2EE again – but is that the whole story?

[Note to readers: this is a long entry. Please bear with it!]

With the announcement last week by IBM, BEA, Oracle, SAP, Siebel, IONA and others that they are collaborating to develop a language-neutral programming model tuned to the needs of SOA initiatives, it looks like a little more lustre has rubbed off J2EE. But it also looks a little like something deeper could be going on: the biggest vendors are shifting their attention to a wider market opportunity. Can they avoid the mistakes of J2EE?

J2EE was originally designed with serious input from IBM (based on its ComponentBroker middleware project), BEA and others – as well as Sun – to standardise a programming model for building web-based business applications. It arrived in the dot.com boom, when the world was awash with small innovative suppliers selling middleware which combined web servers with support for object-oriented programming, and transaction handling against remote databases; and outwardly at least, the idea was a good one – the argument from the authors was that without a clear standard in the market, the cost of developing skills and the risk of vendor lock-in would hold the market back.

But for some years now voices in the software development community briefing against the suitability of J2EE for various types of work have been getting louder.

"If it ain’t J2EE, it ain’t going to do the job"

I’m happy to admit that there’s long been a conspiracy theorist living in my brain, telling me that J2EE was an attempt by the established players (IBM, BEA and Oracle, aided by Sun) to lock small vendors out from the Java application server market opportunity. For most of the time the primary "evidence" to support my feverish imaginings was the fact that certifying products to J2EE has always been expensive – and it has become more expensive as the specification has become more complicated. Small vendors had real trouble getting the resources together to play that game. Now, though, I’m starting to think that the increasingly audible developer discontent with J2EE adds fuel to my fire.

It is entirely possible that J2EE was developed altruistically by the folks involved in the process. However good the original intention was, though, the large vendors’ sales and marketing teams have certainly been happy to associate complete J2EE compliance with the ability to deal with real-world requirements in customers’ minds (see here for just one example).

Innovation always finds a way

The shine on J2EE started to dull when a core element of the Enterprise Java Beans (EJB) element of the J2EE programming model, Entity Beans, started to be seriously tested by developers – and many declared it to be too clumsy and inefficient to provide the facilities which were advertised (again, see here for an example). Curiously, given its stated intent, as well as its shortcomings in helping developers manage persistent data, the J2EE model also never really offered design patterns to help people building web-based applications become productive. More recently, the specification has received brickbats for failing to really get to grips with the challenges of developing and deploying web services.

On top of this, J2EE is managed by the Sun-led Java Community Process (JCP), which has, in all core Java-related areas, eschewed the creation of royalty-free specifications. This is combined with the fact that as J2EE has moved through a handful of major revisions the specification has only become more difficult and expensive to certify against.

More and more Java development shops have been looking to (open-source, royalty-free) frameworks like Hibernate, Spring, Tapestry and latterly Struts – supported by "professional open source" vendor JBoss and vendors with "blended" approaches like BEA – to help them build the kinds of applications that the J2EE specification was originally designed to help with. The JCP has learned some lessons from these frameworks in the new Java Server Faces (JSF) and EJB 3.0 specifications, but the alternatives continue to thrive.

And at the same time as all these J2EE-related shenanigans have been going on, Microsoft has been steadily plugging away at its own .NET programming model – introducing features to hide the muckiness of dealing with web services and various proprietary integration technologies; offering a variety of ways of processing and managing persistent data; as well as making it relatively stress-free to develop client-server applications which can seamlessly work with or without network connectivity.

If you want a winner, you can’t only champion Java

Moreover, IBM and BEA have done little to hide their antipathy regarding the JCP. Both abstained from voting on a recent specification proposal concerning the creation of a standard Java integration framework (JSR 208, for anyone interested ;-). IBM has taken to adding a statement (see here for an example - scroll down in the comments box) to all its JCP submissions to the effect that it disagrees with the JCP’s licensing model. And IBM and BEA have been collaborating heavily, along with a number of other very significant vendors, on two new programming model specifications: Service Component Architecture (SCA) and related Service Data Objects (SDO).

The technical details of SCA and SDO (such as they are, at this early stage) clearly borrow from the innovations of the open-source Java frameworks – as well as from Microsoft’s own programming model innovations in .NET 2.0, the Windows Communication Framework, ADO.NET and LINQ. Most importantly, however, they are explicitly "multi-language". Support for Java is obviously important, but Java is just one language that the project will target: the other "first class citizens" are C++ and PHP.
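SCA's core assembly idea - components exposing services and having their references wired up externally rather than constructed internally - can be caricatured in plain Java. To be clear, this illustrates the concept only; it is not the actual SCA API, which at this stage exists only in early draft specifications:

```java
// Plain-Java caricature of SCA's assembly model. A component implements a
// service and declares references to other services; those references are
// wired up by an assembler, not looked up or constructed by the component.
// This illustrates the concept only - it is not the SCA API.
public class Assembly {
    // A service interface, independent of any transport or language binding
    interface StockQuoteService {
        double quote(String ticker);
    }

    // A component implementation whose dependency is injected ("wired")
    static class PortfolioComponent {
        private final StockQuoteService quotes; // an SCA-style "reference"

        PortfolioComponent(StockQuoteService quotes) {
            this.quotes = quotes;
        }

        double value(String ticker, int shares) {
            return shares * quotes.quote(ticker);
        }
    }

    // The "composite": wiring components together in one place, so the
    // same component could later be bound to a web service, a JMS queue,
    // or an in-process implementation without changing its code
    public static double demo() {
        StockQuoteService fixedQuotes = ticker -> 10.0; // stand-in implementation
        PortfolioComponent portfolio = new PortfolioComponent(fixedQuotes);
        return portfolio.value("IBM", 5);
    }
}
```

The multi-language ambition follows naturally from this shape: because the component only depends on an interface and the wiring is external, the implementation behind a reference could equally be C++ or PHP.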

For IBM and BEA watchers, this move is very significant. Both companies are betting a lot on SOA being a Big Thing – and the truth is that SOA is about integration first and foremost – irrespective of runtime environment or programming language. A SOA technology proposition based solely on Java is a three-legged dog in what will turn out to be a fiercely-contested race. Both IBM and BEA have some serious technical and marketing work to do in this regard, as they have spent many years and hundreds of millions of dollars convincing the world that they are synonymous with “enterprise Java”. BEA’s acquisition of the language-neutral Plumtree was a good step forward; and IBM is constantly being reminded of the need to do something by its Global Services outfit, which makes a lot of money away from the Java world. IBM and BEA’s roles in the development of SCA and SDO are clearly there to support the companies’ moves away from Java-centricity.

For J2EE watchers, the introduction of SCA and SDO is also of paramount importance: it’s yet another sign that J2EE is increasingly seen by many of its big vendor champions as just one possible platform design which will work for customers looking to build and integrate modern business applications. In my opinion, it’s not before time.

SCA: the next J2EE?

The SCA/SDO programming model initiative has a lot of potential to help development shops in the software community, as well as in other industries, get to grips with the challenges that SOA brings. But there are some challenges that the promoting group will have to overcome if it is to really serve the needs of customers – and if it is to avoid the slide into disillusionment that software vendors and enterprises are now experiencing with J2EE.

The group says it’s committed to developing open and royalty-free specifications. But will it also work to keep specifications simple, so as to ensure that a wide range of participants can implement them?

When and how will the group transfer the specifications to an independent standards body? This question is crucial and also tricky. There are many who say that the JCP is too quick to adopt new specifications – leading to endorsements of specifications which aren’t fully road-tested in industry. Standardisation of a technology has to take input from the commercial experience of technology innovators and early adopters, while also making the technology appealing to the larger population of mainstream adopters.

The philosophy is language-neutral. But will the group make explicit provisions for the addition of new language bindings to SCA and SDO by third parties that may come along later, and how easy will this be?

If SCA and SDO are set up to compete against Microsoft’s programming model initiatives, the sponsoring group needs to ensure that they make solid, compelling development tools available quickly. Unless progress is swift they will leave potential customers with a complex set of specifications which are difficult to implement.


This work is licensed under a Creative Commons License.
