Friday, February 24, 2006

Ecosystem vs egosystem

A brief thought for a Friday.

I was at a briefing with some IBM folk the other day, and I was listening to Karla Norsworthy talking about how IBM works to influence industry and technology standardisation. At one point she said "ecosystem" - but the combination of her American accent and my poor listening skills made me hear "egosystem".

Of course it wasn't long before I thought of the blogosphere - it's kind of the ultimate egosystem. On bad days the blogosphere seems to me like the mother of all wine-and-cheese parties from hell - thousands of people in an aircraft hangar with their chests puffed out, saying "blah blah blah" loudly to no-one in particular. Some are more tipsy than others. (Look here for another, more provocative, take on the broader subject).

And of course I then got to thinking about how a lot of the interest in blogging today is about how to marry the egosystem with the business ecosystem - how to leverage blogging and bloggers for revenue, margin, profit, mindshare, whatever. So on the one hand we've got the business ecosystem - where success and failure are based on meritocracy and objective measurement - and on the other we've got the egosystem, where getting ahead is all about who you know and how many people you can suck up to.

And finally, within the space of a couple of minutes, I realised: actually, there's no difference at all ;-)

In business, and as advisors to corporate IT, it's too easy for us to kid ourselves that businesses are machines which can be mechanistically optimised. The blogosphere reminds us that underneath it all, it's egos that run the machines.
Wednesday, February 22, 2006

More on the 'P' word - back to basics

Following up on Neil's post about policy, I thought it was worth making a single point about whether or not the concept of "policy" can be in or out of vogue. It's a simple enough point, but one that took me years to work out: a policy defines a set of rules. However the term "policy" is defined for whatever situation, this principle should underlie any definition.

So, for example, the statements "It's not business policy to allow that" or "we have a data retention policy of 30 days" both map back to simple rules that determine whether the policy applies - and then what to do about it.

I don't want to underestimate the complexity of business situations, but I do want to acknowledge that the building blocks - the central concepts - are quite straightforward. I believe there are four fundamental concepts in any business or IT situation, all of which are related:

- policy - sets of rules to be applied by the business
- entity - what the business sees as the "things" it deals with
- service - things that can be requested of the business
- process - concatenations of activities to support service delivery

(there might be a fifth - state - but state is also about the relationship between entity and process, so it can be derived from the above)
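As a rough illustration of how these four concepts hang together - and of how a policy reduces to rules of the form "does it apply, and if so what do we do?" - here's a minimal sketch in Python. The names and structures are mine, invented purely for illustration; they aren't drawn from any particular vendor's model or methodology.

```python
# A minimal, hypothetical sketch of policy, entity, service and process.
# All names here are illustrative only.
from dataclasses import dataclass


@dataclass
class Entity:
    """A 'thing' the business deals with - a customer, a document, an order."""
    kind: str
    attributes: dict


@dataclass
class Policy:
    """A set of rules: each rule pairs an applicability test with an action."""
    name: str
    rules: list  # list of (condition, action) pairs

    def apply(self, entity: Entity) -> None:
        for condition, action in self.rules:
            if condition(entity):   # does the policy apply here?
                action(entity)      # ...and if so, what do we do about it?


@dataclass
class Process:
    """A concatenation of activities supporting service delivery."""
    activities: list  # callables, each taking an Entity

    def run(self, entity: Entity) -> None:
        for activity in self.activities:
            activity(entity)


@dataclass
class Service:
    """Something that can be requested of the business."""
    name: str
    process: Process

    def request(self, entity: Entity) -> None:
        self.process.run(entity)


# The 30-day data retention policy from above, as a single condition/action rule.
retention = Policy(
    name="data retention",
    rules=[(lambda e: e.attributes.get("age_days", 0) > 30,
            lambda e: e.attributes.update(archived=True))],
)

record = Entity(kind="document", attributes={"age_days": 45})
retention.apply(record)
assert record.attributes["archived"] is True
```

State, the possible fifth concept, would fall out of this naturally: it is whatever a process has so far done to an entity.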

We can dress these up in whichever terminology is in vogue, but to say they should or shouldn’t exist is tantamount to saying that atomic particles shouldn’t exist - they'll still be there, whether we want them to be or not!

Given that enterprise IT exists purely to automate business models (another simple, yet fundamental principle), it follows that there are IT equivalents of the above, and that they are mappable one to the other. Any shortfalls between the mappings are more an indication of our failure to deliver IT in the way the business needs it than anything else.
Tuesday, February 21, 2006

Insight on information security - well worth a read

The other Neil alerted me to Security Incite, a fellow specialist analyst company focussed on the information security market, founded by Mike Rothman (former META analyst, PKI entrepreneur and marketing VP at CipherTrust and TruSecure). The company has an innovative community-driven approach to working with technology adopters, but that's not what I want to discuss here.

I wanted to call out Mike's recent post which defines a pragmatic segmentation of the confusing world of information security. I thoroughly endorse his approach in providing the structure that IT buyers need to help them make effective security investment decisions and to understand how all the pieces fit together.

Also, I can empathise with his motivations: he needed to go through the process to make sense of it himself. I have gone through a similar process in my investigations of just one area of Mike's model: identity. In fact, as will become apparent in our soon-to-be-released report on identity management, there are strong parallels between Mike's analysis of the whole area and my perspective on identity management architecture. It's about a clear separation of concerns - infrastructure security, information security, identity, policies and reporting in Mike's case; identity data sources, identity and access services, policies and lifecycle management in mine.

As Mike drills into each of his areas, it will be interesting to see whether he identifies a similar set of capabilities: repositories, security services delivered as infrastructure, policy-based management and monitoring, and security lifecycle management.

Beware the 'P' word

Yesterday was a day out of the office. The other Neil and I were at briefings with the VP of Product Development of an ESB company and then the President/CEO of a service-oriented management company. Whilst the focus of their employers' respective offerings is clearly different, there was one theme that came across loud and clear in their respective pitches: policy.

Unsurprisingly - these were SOA-related briefings after all - interoperability and standards support were equally strong themes and, more specifically in this regard, WS-Policy. They were both describing the use of WS-Policy to define operational requirements - security, load balancing, transformation etc. - which can be enforced by their own solutions and other components of the SOA infrastructure.

Once back here in The Fens, I began trawling through the list of unread blog posts when I came across this from Ian Glazer at Trusted Network Technologies (via the very useful Planet Identity feed - thanks Pat). Ian calls out a post from Sara Gates, Sun's VP of Identity Management, reflecting on the recent RSA Conference and calling for a moratorium on the use of the 'p' word because:
It’s become a bad word in that the word “policy” in the technology arena has so many meanings that it has actually become meaningless. “Policy” means a lot of things, all of them ultimately in a business, and often, security context. A policy can be on data protection, a policy can be on access control in the platform or application, a policy can be in a dusty three-ring binder that no one ever uses, a policy can be made in response to a law or regulation
Ian goes on to say that
The Identity lexicon is a strange one. We use words that have multiple meanings. We use terms to hide the realities of market segments. Policy is definitely high on the list of overused and under-defined terms.
and cites as evidence the proliferation of policy management interfaces within identity management solutions:
I spent last week asking a variety of vendors how many different policy management interfaces they have for their products. I think the average for a decent sized identity management vendor is around 5. (One vendor told me of over 10 different policy management interfaces for their suite of products.) Customers are being overwhelmed with different policy tools. Multiple policy management interfaces from multiple vendors.
Don't get me wrong. Here at MWD we are strong advocates of policy-based approaches, as anybody who is familiar with our views on SOA will be aware:
By encouraging openness, flexibility and reuse, a service-oriented approach guarantees that we cannot know in advance which consumers might request which services. We cannot know what kinds of obligations might need to be fulfilled, until a request is made. The way to handle this uncertainty, which is sure to arise as service portfolios expand and become more complex, is to use the design concept of “policy” to dictate the conditions which must exist for the contract between a consumer and a provider to be fulfilled.
As IT organisations strive to break down the application and infrastructure stovepipes which are constraining their businesses and move towards a distributed, virtualised, heterogeneous architecture, their ability to define business-meaningful policies which can be enforced consistently throughout the fabric of the 'next-generation data centre' will be critical. I agree completely with Sara when she says:
How about a moratorium on the “P” word unless it is modified with a precise, readable explanation of what we mean?
I would go further to add 'consistent' to her list of adjectives. Customers are currently suffering from a case of policy overload. Vendors operating across broad swathes of the technology landscape - application lifecycle management, information lifecycle management, IT governance, IT service management, service infrastructure etc etc - are promoting policy-based approaches. But can they interoperate? Ahh - but surely that's where WS-Policy comes in.

Let's be clear. First of all, WS-Policy has not been submitted to a standards body.

Secondly, as the authors (BEA, IBM, Microsoft, SAP, Sonic and VeriSign) point out:
The Web Services Policy Framework (WS-Policy) provides a general purpose model and corresponding syntax to describe and communicate the policies of a Web service. WS-Policy defines a base set of constructs that can be used and extended by other Web services specifications to describe a broad range of service requirements, preferences, and capabilities.
The key phrases here are "general purpose" and "can be used and extended by other specifications to describe a broad range of service requirements, preferences, and capabilities" (WS-SecurityPolicy is one example of such an extension, focussed on policy assertions for web services security). In other words, WS-Policy does not deal with semantics: it provides a framework within which those semantics can be defined. Support for WS-Policy provides no guarantee that the way one vendor defines a particular policy can be interpreted and enforced effectively by another. That will require agreement on semantics. It's not going to be easy! It will require the participation and cooperation of vendors of all shapes and sizes - vendors, moreover, who are going to have to relinquish the control that ownership of policy definition can provide.
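To make the framework-versus-semantics point concrete, here is a deliberately simplified sketch. The "All" operator is loosely modelled on WS-Policy's composition operators; the assertion names and vendor handlers are invented for illustration. Two parties can exchange and parse exactly the same policy structure while enforcing it in quite different ways - or not at all:

```python
# A hypothetical sketch: a shared policy *framework* without shared *semantics*.
# The operator is loosely inspired by WS-Policy's wsp:All; the assertion names
# and vendor handlers below are invented.

class Assertion:
    """An opaque requirement: the framework carries it but never interprets it."""
    def __init__(self, name: str, params: dict):
        self.name, self.params = name, params


def all_of(*assertions):
    """Every assertion must be satisfied (cf. a wsp:All-style operator)."""
    return ("All", list(assertions))


# Both vendors can parse this same policy structure...
policy = all_of(
    Assertion("Encrypt", {"algorithm": "AES"}),
    Assertion("Audit", {"level": "full"}),
)

# ...but each attaches its own meaning (or none) to each assertion.
vendor_a_handlers = {
    "Encrypt": lambda p: f"transport-level encryption using {p['algorithm']}",
    "Audit":   lambda p: "log message headers only",
}
vendor_b_handlers = {
    "Encrypt": lambda p: f"message-level encryption using {p['algorithm']}",
    # Vendor B has no notion of an 'Audit' assertion at all.
}


def enforce(policy, handlers):
    operator, assertions = policy
    for assertion in assertions:
        handler = handlers.get(assertion.name)
        if handler is None:
            print(f"{assertion.name}: not understood - cannot enforce")
        else:
            print(f"{assertion.name}: {handler(assertion.params)}")


enforce(policy, vendor_a_handlers)  # both assertions 'enforced', vendor A's way
enforce(policy, vendor_b_handlers)  # same policy document, different behaviour
```

This, in miniature, is why support for WS-Policy alone guarantees nothing about interoperable enforcement: the agreement that matters is on what each assertion means.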

We will certainly continue to highlight this issue in our discussions with vendors, but I am not naive enough to believe that the opinions of analysts carry the same weight as those of customers and prospects holding IT budgets. I am hopeful that, as the 'p' word is raised in vendor pitches, some tough questioning ensues.
Saturday, February 18, 2006

Nick Carr isn't always right - but ignore him at your peril

I'm obviously only a yellow-belt blogger. Reading Dan Farber's post today over at ZDNet about reaction to Nick Carr’s presentation at OSBC reminded me that I'd seen Nick Carr present almost exactly the same deck at Infoconomy's Effective IT Summit in Cardiff, Wales a couple of weeks ago (where I was lucky enough to be invited as an "expert witness"… whatever that is ;-)) and had meant to blog on it after – but had entirely forgotten.

My reaction to seeing Nick then was the same as on reading Steve O'Grady's and Phil Wainewright's takes just now. I love reading Nick Carr's stuff. He’s really smart. He's not always right – but then who is?

At the Cardiff conference there was quite a prolonged Q&A after Nick's keynote talk. Lots of people (many of them representing vendors, but some IT directors too) received it pretty coldly, and didn’t hold back from saying that he was plain wrong. IT is completely different from electricity, they said – and yes, they're right - Phil Wainewright nails part of why Nick's thesis is a bit out of whack here.

There's another reason that Nick is a bit off base, as I told him then (I managed to squeeze the last question in – and as any IT industry analyst will recognise, managed to state my opinion rather than asking a question ;-) ). It's that his view appears to be binary: you either "generate locally", or you "use a utility". The truth is that organisations can and should think about this question at a more granular level, focusing on business processes. Interestingly Nick himself kind of recognised this in his talk, by saying "increasingly, innovative organisations will make multiple sourcing decisions rather than one". The reality is that technology and business process maturity now mean that organisations aren’t best seen as "innovators", or "laggards" or whatever. They are a mixture of all of those. The key is to think about which parts of the organisation are about innovation – and to retain "local generation" in those parts; while moving towards a utility model of provision, as far as it can be attained, only in those parts which are naturally less needy of technology to drive competitive differentiation. Our reports on BPM and SOA explore this in more detail.

The main point of this blog though is that however wrong you might think Nick Carr is, his ideas mustn't be ignored. Why? Because his views very neatly, in my experience, represent the thoughts and concerns of the businesspeople who pay every IT salary out there in industry. Nick has a business brain, speaks about commerce and economics, and resonates brilliantly with businesspeople. Many IT geeks distrust him. And there's the rub. Any IT professional who wants to improve the relationship between IT and business out there has to work out how to clearly articulate counter-arguments to Nick's thesis, rather than dismissing him. That's tantamount to saying "users just don't get it".
Thursday, February 16, 2006

Microsoft vs EC - adequate response, but who remembers the question?

I went to a very interesting briefing with Microsoft this morning, following up on the EC case documented here. Essentially, to comply with EC requirements the company has documented parts of its Windows Server code that were previously invisible to the outside world, and made this documentation available to its source code licensees.

But, does it really matter? The central question is supposed to be, “Has Microsoft done enough to convince the EC that it has provided an adequately documented code base to interested parties?” From a developer's perspective, previously inaccessible elements of code are now as accessible to licensees as code that was already available to MSDN subscribers. Documentation exists as Windows help files and is therefore as usable as other Windows help files, and the source is searchable and accessible through a simple but adequate browser-based system. To say that these provisions are insufficient would be tantamount to saying that the Windows help system and the browser were inadequate mechanisms for text-based information access. Someone will have to spend the time to ensure that all the code is documented, and while they may identify weaknesses in the existing documentation, it is unlikely they will find too many gaps.

In other words, Microsoft’s efforts should be good enough for most developers that have the wherewithal to understand the code, and therefore they are good enough for me.

There remain some open issues, notably around the licensing model itself - whether it is a workable framework for competitors in the open source community, for example. Microsoft has made some efforts in this direction, but does stipulate that its code must remain private. Clearly, Microsoft is trying to balance openness with the need to protect its IP, but some may question whether it is going far enough.

Whatever the outcome, however, it becomes almost irrelevant when compared to the real questions underlying the debate. Microsoft was instructed to open up its code to combat accusations of anticompetitive practices, and of abusing its pseudo-monopolistic position with desktops and departmental servers. The inordinately long and slow legal process links right back to the MS vs Netscape antitrust cases in the US. The trouble is that whatever efforts Microsoft makes to open up its Windows Server code in the here and now, there is no guarantee that confidence in the company’s desire to “play fair” will be restored. Most importantly, however, the move won’t make a jot of difference to Microsoft's ability to compete.

There are a number of reasons for this. First, the scope is limited to Windows Server and doesn’t cover more current areas of direct competition – Microsoft Exchange, for example, has until recently locked out any direct connections with devices not running Microsoft software. Was this anti-competitive? You betcha. There are other examples – in the past we had C# and J# undermining C++ and Java, and today, Microsoft’s virtualisation engine should be subjected to far more scrutiny. In a bizarre twist, Microsoft itself is being prevented from developing features that should clearly be tied into the very heart of the operating system – antivirus is one example, where these old legal battles are preventing legitimate innovation.

It’s the old adage, “if you want to get there, don’t start from here.” The legal system is unable to keep up with a rapidly changing industry, so by the time anyone works out that company X is being uncompetitive to company Y, the world has already moved on and company Z has come from nowhere. In the midst of the browser wars, who expected such developments as Firefox or openoffice.org, or indeed Linux? There are plenty of other examples of where Microsoft doesn't have a monopoly, from gaming to Symbian. Meanwhile, just as Bill Gates once predicted when talking about his fears, competition has come from left field, with the overvalued industry darling Google establishing itself as a strong competitor and putting paid to the idea that Microsoft would take over the world. It seems laughable now, but there were plenty of people who believed it. Microsoft is number three in that market, which is likely to see significant growth and innovation, and all bets are off as to who will dominate. Meanwhile Microsoft is using its position to push new technologies – Vista and Office12 for example, or the small business suites from the likes of Navision and Great Plains – in ways that the EC may at some point in the future decide are anticompetitive. By then, however, it will be too late to do anything but follow through again with some ill-considered rearguard action.

Can anything be done at all? It is difficult to say, but the problem lies in the legislation, not the vendors. I would be looking at how software is imported into the EU, and considering import criteria for Microsoft requiring that any new subsystems have open interfaces which enable them to be swapped out and replaced with those of a competitor. The good news is, this is the way the world is going anyway, and Microsoft is following suit. No enterprise organisation will ever follow a floor-to-ceiling Microsoft model, and in the service-based world, the historic walls between applications and software services are being forced open. Microsoft knows that it is in its interest to open up a little, if it wants to stay in the game. Technology constraints enabled Microsoft to get to where it was in the first place, but those constraints are no longer applicable. Even if Microsoft was anti-competitive in the past or abused its unique position, it is exceedingly unlikely that it would ever be able to do the same again, and any legal battle to rein it in or prevent it from doing so becomes less and less relevant.
Tuesday, February 14, 2006

It must be that time again... more software announcements from HP

HP's troubled relationship with middleware technology has been well-documented (see here and here for examples). Now, after a relatively quiet period, HP is making a flurry of announcements about new deals. Hot on the heels of the acquisition of archiving software vendor OuterBay last week, it's announced today that it's deepening its middleware reseller relationship with Oracle. From HP's own press release:
HP plans to develop and deliver SOA-based business services that help customers automate business processes and integrate disparate data and applications, while leveraging existing applications.

The challenge for HP in playing the SOA card has long been that most enterprises are still coming at SOA from a development/integration perspective – where HP has little in the way of capability to directly support them. This means it needs to partner with the likes of Oracle, Microsoft, and BEA. And of course correspondingly, the challenge for Oracle in pitching SOA is that customers deploying Fusion Middleware will ultimately need to manage the applications they build.

On paper it makes sense then – but from HP's initial communication about this partnership, it's difficult to see why customers interested in SOA and Oracle's technology would turn to HP, which seems to just be pitching its installation services into the mix and missing the opportunity to promote its own IT service management capabilities in the context of SOA projects.

I wonder if this is a case of left-hand-doesn't-know-what-right-hand's-doing...
Sunday, February 12, 2006

HP and OuterBay - packing the storage portfolio

Jon Collins here. Having recently joined MWD as a principal analyst, I was asked what I thought about the OuterBay acquisition. Good question. From where I’ve been sitting, HP has never had the best track record in enterprise software. The next “e” never was e-services, and HP has acquired many a product only to let it wither on the vine – Bluestone was a prime example. Meanwhile the company has struggled to bring together its OpenView service management portfolio with Compaq’s Insight Manager systems management tools to present a coherent portfolio. Is it any wonder, then, that we’ve seen considerable erosion of HP’s credibility as it tries to sell software into the enterprise?

All is not doom and gloom, however. Last year, HP started showing its roadmap to bring together its storage management portfolio, a roadmap which demonstrated both an understanding of enterprise customers’ storage needs, and the wherewithal to do something about them. The keyword is “integration”: HP understands that, to provide tools which do more than manage storage components, it needs to see storage management from end to end, across the range of potential platforms, from the creation and referencing of information through to its archiving and ultimate deletion.

While parts of HP may have “got” this in the past, it appears to now get it as a company, or at least across the storage division. The exact term applied to this view does not matter much to me; the term currently in vogue is Information Lifecycle Management, or ILM. HP has chosen to announce this week’s acquisition of OuterBay under the “ILM” banner, albeit that OuterBay only meets a subset of the requirements of end-to-end storage management – that is, it only deals with structured information as used by databases. It also focuses mainly on archiving in a tiered storage environment, rather than any vision of truly dynamic data movement. The important thing from this perspective is that HP understands what it wants its storage management offering to look like, and it has plugged a gap.

To really add value, HP’s task is now to integrate OuterBay’s capabilities into its existing storage toolset. In this way the acquisition can succeed where others have failed – this time, it is not about capturing a tactical market (though that may have aided the decision). The press release states that HP’s existing customers are asking for the database archiving capability, which I believe is a good indicator that they are on the right track.

Open questions remain. The first is the impact on EMC hardware customers that currently use OuterBay. HP hopes this situation can remain as it is, and indeed, EMC has had to take an increasingly philosophical stance to partnering, following its own acquisitions of companies like Legato, VMware and Smarts. Ultimately, I expect both EMC and HP to plan to migrate their own customers to their own hardware and software portfolios, but it is in nobody’s interest (least of all the customer’s) to rock the boat unnecessarily.

Second, HP needs to face the fact that ILM solutions should be less expensive than the traditional “pile-em-high” approach to storage. HP sales execs will have the choice of selling another bunch of disks, or offering a more cost effective alternative based on software. When an expenses-paid trip to Bermuda (because they hit their sales targets) may ride on the decision, the stakes are pretty high. There are various statistics about the amount of spare storage capacity in enterprise environments. Is HP really willing to take the hardware hit, should it become successful in selling the software?

Finally, HP still needs to sort out its marketing at a strategic level. As an engineering company HP has never been very good at marketing, and somehow the integrated HP/Compaq managed to adopt the HP approach. In reality there’s currently no such thing as an “adaptive enterprise,” whatever HP may want us to think – as anyone who works in a large organisation (think: every HP employee) should know. It would be wise for HP to bring its messaging a bit more down to earth. I believe a good place to start is with storage management and ILM, which is all about aligning resources with business needs. We might not be there yet, but get this right and HP may indeed bring the adaptive enterprise one step closer.
Thursday, February 02, 2006

Mashups: VBAD, not SOA

I was reading "Rethinking BPM in a mashup based SOA world" – which brings together ideas from some great bloggers: the author David Berlind, Steven O'Grady, and Sandy Kemsley – and I completely see why the mashup phenomenon is potentially interesting in an enterprise IT context.

But I think it’s dangerous to associate mashups and SOA too closely. Why? Because the real value of SOA (and I realise I’m starting to say this an awful lot) comes principally from thinking about the "S" and the "A". The "S" is about more than just components-at-a-distance – "service" implies not just a functional commitment but operational quality and financial commitments too. The word "service" as part of the term "web service" is causing just as much harm as good in this respect. And the "A" is about a process of designing and managing a portfolio of technology assets to balance a client’s short-term and long-term requirements. Mashups don’t really get involved in either the "S" or the "A". In fact they’re almost the antithesis of these SOA principles.

Nope, mashups are much more about VBAD – Visual Basic at-a-distance.

Which is not to say they have no value. They’re a really interesting form of composite application development. And I was a developer back when VBXs were all the rage, so I remember the value that both Redmond and IT users got out of VB. Of course, as David Berlind points out at the end of his blog:
Once business managers start developing applications, who is going to manage and support all that code?


UPDATE: Neil Macehiter just alerted me to this post at ZDNet by Dion Hinchcliffe – it's a really interesting report on what Microsoft is attempting to do in fusing some Web 2.0 concepts and its business-focused "Live" software-as-service offerings. But like so many other commentators he draws a bold line between Web 2.0 composition and SOA. He's right that Web 2.0 concepts have parallels in the use of SOAP, WSDL etc for application integration – but as I'll argue 'til I'm blue in the face, this is just one element of SOA; and it's not even a necessary element.
Wednesday, February 01, 2006

The Vista business proposition - or lack of it

Prompted by the coverage (here at News.com for example) of Jim Allchin's recent "Vista" tour of US press and analysts, I got to thinking about the business proposition for the next release of the Windows operating system - or lack of it. To date, Microsoft has focussed its messaging at its natural audiences: the developer (with WCF, WPF, WWF...) and the IT Pro (with a raft of security enhancements, simplified deployment ...). However, with the IT budget holders still maintaining a firm grip on the purse strings, Microsoft really must begin talking to other key stakeholders.

With our focus on IT-business alignment here at MWD, I believe the Redmond marketers need to explain Vista capabilities in terms of the business outcomes they enable (and to do so pretty sharpish, given that deployment is likely to take at least 18 months following any decision to go ahead). They should begin with the largely consumer-oriented messaging around the three pillars of Vista:

Confident - Security & Privacy (UAP, anti-spyware), Deployment & Servicing (image management, event logging), Performance (smart caching, disk optimisation) and Reliability (enhanced crash analysis and diagnostics, hardware monitoring)

Clear - Visualisation (3-D UI, sidebar), Browse, Search, Subscribe (tabbed browsing, shrink-to-fit printing), Information Management (quick search, virtual folders) and Photos, Music and Media (picture and video library, DVD burning)

Connected - Any Time, Any Place (wireless networking, application firewalls), Systems and Software (peer-to-peer platform, web services), Devices (auxiliary displays, universal syncing), People (ad-hoc workgroups, network projection)

and refine them in terms that will appeal more directly to business decision makers (plus the systems integrators and managed desktop service providers acting as a proxy for those decision makers).

This is not going to be easy. There is not a direct correlation between the capabilities that Microsoft associates with each of the pillars and business outcomes, just as there is not a direct mapping between stovepiped technologies and what actually happens in the day-to-day activities of most businesses. For example, Microsoft should couch capabilities such as the peer-to-peer platform, the creation of ad-hoc workgroups and mobility enhancements in terms of enabling organisations to transition from employee-level to organisational productivity through enhanced collaboration. Similarly, it should combine Windows Workflow Foundation, web services and search enhancements and position them as enablers of process-based, rather than task-based, automation (particularly for the more ad-hoc, collaborative aspects of business processes which are dependent on unstructured information).

I realise I have only scratched the surface but I think it highlights that Microsoft has some way to go if it is to persuade businesses to make the significant investment associated with a large scale desktop OS migration.

