Thursday, April 27, 2006

Microsoft and management - steady as she goes

If there’s one thing Americans know how to do, it’s conventions. It seems that every town has its venue, and every association worth its salt will organise an event at least annually. In San Diego this week, Microsoft’s management summit cohabited the downtown convention center (sic) with the Society of Realtors and the American Association of Airport Executives. Take away the logos and defocus the eyes a little, and it becomes almost impossible to tell the difference.

It is surprising then, as I sip a beer at the airport on my way home from MMS, that I feel strangely puzzled. Everything was there – the executives bounding onto the stage for the keynotes, the hands-on sessions with the experts, the exhibitors and their stands, the embroidered shirts and the giveaways. All the same, something didn’t feel quite right. After four days steeped in the place, the technologies and the messages, I think I’ve worked out what it was.

Microsoft management is in transition. From being a supplier of tools to help support its own products, which begat Systems Management Server (SMS) and then Microsoft Operations Manager (MOM), the company is moving towards supporting a more holistic view of management. This much is obvious – we’ve all heard about the Dynamic Systems Initiative (DSI) and Microsoft’s desire to manage heterogeneous (a.k.a. Microsoft plus Linux) environments. At the conference Microsoft announced acquisitions such as AssetMetrix, aimed at rounding out the existing offering, in this case, software licensing. "Note that there is no 'Buy Now' button," said Bill Anderson in the demo of how this would be integrated. "We're not that evil!" quipped Kirill Tatarinov.

This is a long-term play, and it’s going to take time. What’s less obvious is how this impacts Microsoft’s core customers for management products. While the company has made progress since this time last year, Microsoft freely admits that it is unlikely to be able to tackle the “big four” enterprise management framework players – IBM, HP, CA and BMC – head on, by poaching their customers. Instead, Microsoft is initially looking to provide a more comprehensive offering for existing users of Microsoft management tools – potentially, but not exclusively, in conjunction with the enterprise frameworks.

As a company, Microsoft is looking to up the ante, to claim a seat at the table of enterprise management. It isn’t yet ready to do so: the gaps in its portfolio can’t be filled by announcements of future products, however committed it is to delivering on its roadmap. While Microsoft has a long-standing relationship with ITIL, this is largely to be found in its prescriptive advice (as the Microsoft Operations Framework, MOF) rather than being reflected in the products themselves.

In the short term at least, Microsoft’s customers will see little difference other than a rebranding of SMS and MOM to come under the “System Center” moniker. Customers will wait at least a year for the third pillar of System Center, the planned service desk, complete with configuration management database and ITIL-oriented workflows. Here’s the paradox: while Microsoft is talking the talk of enterprise management, it’s not yet walking the walk. More importantly, neither should it, not in the short term.

For the big four, “the walk” has often resulted in over-expensive, over-complex frameworks that have never quite delivered on their potential. In addition, such products, and the companies that provide them, depend quite heavily on services to support their implementation, something Microsoft has neither the desire nor the current capability to replicate. At the San Diego conference there were three thousand attendees, already versed in SMS and MOM. Microsoft has built its management business from the ground up, delivering practical tools to solve real problems. Whatever the company delivers over the next 18 months, it cannot afford to alienate its existing customer base – any new offerings will need to be as practical and grounded as the tools they replace, as it is the existing customers that are the most likely buyers of Microsoft’s next generation products.

It may be sorely tempting for Microsoft to paint a visionary picture of management, to pitch DSI alongside HP’s adaptive infrastructure or IBM’s autonomic computing. To do so would be folly, as these visions have largely been proven to be guff. IBM is now distancing itself from matters autonomic, and HP is downgrading its hyperbole to refer to the “adaptive infrastructure”. Microsoft's areas of strength are in the mid-market and in the Microsoft-centric shops in larger enterprises. While the goal may be heterogeneity, if it wants to retain its existing customers, Microsoft does not have the luxury of treating DSI purely as a marketing campaign. It has to deliver real results, to real people, solving real problems, or it will be perceived as a failure.

There’s the rub: Microsoft needs to present itself as a “player”, all the while keeping its feet on the ground. The message may be big, but the audience is currently small, and that’s the way it should be, for now. A few years ago, at another user conference from another vendor, I was talking to a customer just after the keynote. “Those guys out there,” he said, indicating the execs who had just been speaking, “have no idea what it’s really like inside a data center.” Microsoft may not yet have a seat at the table, nor does it yet have the wherewithal to offer enterprise management across the board, but it does have the benefit of tangible, practical experience in its own domain. By working from the ground up and linking DSI to solving the real problems of real people, Microsoft stands the best chance of success.
Tuesday, April 25, 2006

Podcast episode #4: news analysis and some insight from the field

We've just posted episode four of the pretty-much-weekly MWD podcast.

Neil M hosts this episode with Neil W-D (Jon is in San Diego at the Microsoft Management Summit). The format is a little different this week: the two of us ponder recent industry news and Neil M's discussion with enterprise architect (and fellow blogger) James McGovern.

If you want to know what we think about Scott McNealy's resignation; context-based authentication; and how the general ledger and chart of accounts influence IT architecture, then you can listen in here.

You can subscribe to the podcast feed here.

Feedback and questions are welcome, as always.
Monday, April 24, 2006

More ecosystem building from VMware

Following on from my previous post regarding VMware's attempts to foster a healthy ecosystem around its virtualisation offerings, the company's at it again. According to CNET (I can't find a press release on the VMware site), VMware has formed the Virtual Desktop Infrastructure Alliance, which includes Citrix, ClearCube Technology, IBM, Hewlett-Packard, Wyse Technology, Sun Microsystems, Altiris and Softricity. What links all of these vendors is the notion of delivering a user's desktop, and in the case of the latter pair, the applications they use, from the server: the server-managed desktop.

So, whether the desktop resides on the user's client device or on a remote server (most likely a blade server, which is where ClearCube, IBM and HP fit in) and is delivered via a thin client (that's Citrix, Wyse and Sun), VMware provides the desktop virtualisation.

UPDATE: The press release has now appeared. Intriguing how the press is able to cover a story the day before it is released to them ;-)
Tuesday, April 18, 2006

Podcast episode #3: ITSM, ITIL - where's the value?

We've just posted the third episode of the pretty-much-weekly MWD podcast.

Jon Collins hosts this episode, and guides a three-way discussion on his current research focus - IT service management.

You'll learn answers to two crucial questions: why doesn't it make sense to buy an ITSM product? And what has ITIL got to do with Dr. Who?

The podcast audio is here; you can subscribe to the podcast feed here.

Feedback and questions are welcome, as always.
Thursday, April 13, 2006

Liberty must focus on user privacy and experience

This starts where my earlier discussion of the Liberty Alliance Project's approach to user-centric identity left off - with a discussion of some of the important user-centric issues that Liberty can ill-afford to ignore.

Mechanisms need to be in place to ensure that identity providers and service providers aren't able to build up pictures of an individual's activities, and so potentially compromise privacy. The Liberty white paper discusses some workarounds but further work needs to be done.

Also, Liberty must extend its focus beyond backend protocols and recognise the importance of a consistent user experience. Without such consistency an individual is likely to be confused as they interact with different combinations of identity and service providers. I am not necessarily suggesting that Liberty define a single user interface but rather that there is consistency in the dialogue, the use of interface cues etc. This is one advantage of Microsoft's InfoCard approach: an easy-to-understand credit card metaphor with a common user experience.

This was acknowledged by yesterday's presenters and Liberty does have some guidelines already, such as the ID-WSF Interaction Service, but more work is required. One possible avenue to be explored is collaboration with the Higgins Project, given that it is focussed on standardising how developers exploit different identity management solutions. The big challenge here of course (as I discussed here) is that Higgins is an Eclipse project and Sun, which remains wedded to its NetBeans alternative to Eclipse, is a driving force behind Liberty. Coincidentally, Paul Trevithick, CEO of Parity Communications and the project lead of Higgins, has been seeking input from the Identity Gang's Identity Workshop mailing list on one aspect of the user experience: consistent, meaningful naming of "information card thingies".

Clearly, it is still early days, but organisations that deliver Internet-based services to the public at large need to be closely monitoring developments around user-centric identity. Going forward, individuals are going to demand simpler, consistent mechanisms for securely accessing those services, where they are firmly in control, and which do not compromise privacy.

Liberty, LECPs and user-centric identity

Yesterday I attended a Liberty Alliance Project webinar on user-centric identity (see here for some press coverage and, no, although I asked the question about user experience referred to in the piece, I am definitely NOT a developer working with Microsoft's InfoCard).

Some of you reading this are perhaps thinking "Must have been a short webinar - isn't Liberty an enterprise-centric play?". Until about 3 weeks ago, I would have agreed with you, as I discuss in our report on identity management:

Current models for identity federation fall into one of three types:

bi-partisan, involving federation of identities between two service providers

hub-and-spoke, involving federation between a dominant service provider and a number of its partners

a “circle of trust” involving federation between multiple service providers operating as peers.

These solutions are typically based on dedicated technologies and/or products implemented in a point-to-point fashion with the federation, rather than the individual, at the centre. As a result, it is highly likely that the individual will have to possess a different digital identity for each federation they become involved in and furthermore will be subjected to multiple, potentially inconsistent, experiences.

Identity management technologies and standards need to move to support a federated design, which is network-based (a “super” circle of trust), where today’s federated identity solutions are themselves federated.

I began to change my mind, however, when I read Liberty's Personal Identity white paper, published at the end of March. At the time I felt, and this was confirmed by John Kemp of Nokia during the webinar, that it had dawned on Liberty that user-centric identity had become a hot issue, not least because of the work of Microsoft's Kim Cameron with the identity metasystem and InfoCard and the recent buzz of interest around the Higgins project.

In a nutshell, the Liberty paper explains how identity selector capabilities, which allow an individual to control how their identity data is passed between identity providers and service providers, can be implemented using the Liberty specifications. These capabilities depend on part of the Liberty ID-FF web-based, federated single sign-on specifications, known as the Liberty-Enabled Client or Proxy Profile (LECP).

Without LECP, when an individual makes a request to access a service which requires authentication by a different identity provider, their browser is automatically redirected from the service provider's site to the identity provider without user intervention, i.e. the user is not in control.

LECP acts as an intermediary, allowing the user to control which identity provider is accessed. In addition, LECP can be used to implement identity services, based on Liberty's ID-WSF web services-based specifications (authentication, discovery, personal profile), as user agents running locally, e.g. in a web browser on a mobile device. The webinar included some live demonstrations of the LECP in action, which you can get some idea of from the blog of Hubert Le Van Gong (the other webinar presenter). This is definitely a move in the right direction by Liberty and holds out some hope for my "super" circle of trust. However, there are some important issues still to be addressed (to be continued ...)
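The difference between the two flows can be sketched in a few lines of Python. This is purely illustrative - the class names, header name and return values below are invented for the sketch, not taken from the Liberty specifications:

```python
# Sketch of the LECP idea: a Liberty-enabled client declares itself to the
# service provider, receives the authentication request back, and lets the
# *user* choose the identity provider - instead of being silently redirected.

class ServiceProvider:
    def __init__(self, default_idp):
        self.default_idp = default_idp

    def handle_request(self, headers):
        if headers.get("Liberty-Enabled"):
            # LECP flow: hand the authn request back to the client,
            # which decides where it goes.
            return {"authn_request": "<AuthnRequest/>", "redirect": None}
        # Legacy flow: silent redirect to a provider the user never chose.
        return {"authn_request": None, "redirect": self.default_idp}


class LibertyEnabledClient:
    def __init__(self, trusted_idps):
        self.trusted_idps = trusted_idps

    def access(self, sp, choose):
        # Announce LECP support, then let the user (via `choose`)
        # pick which identity provider receives the request.
        response = sp.handle_request({"Liberty-Enabled": "true"})
        return choose(self.trusted_idps), response["authn_request"]
```

In the legacy flow the service provider silently redirects to its default identity provider; with the LECP flag set, the authentication request comes back to the client and the choice of provider rests with the user.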
Wednesday, April 12, 2006

Introducing the Uncompany

As I mentioned earlier, the research and analysis project that's taking up a lot of my time at the moment is focused on Web 2.0 and the enterprise.

I've been scratching my head for some days now, trying to think about how (or even if) it's possible to analyse the business impact of Web 2.0 in a structured way. "Web 2.0" is very hard to distil as a concept - because in reality it's a label for a period in time (much like the Cambrian era - in more ways than one). We end up with silly "definitions" which can only point to things and say "that's Web 2.0," and "this isn't Web 2.0". That's not exactly going to help a company figure out if they should do anything about this Web 2.0 stuff; and if so, what they should do.

As well as talking to enterprises and interviewing a large number of vendors, I've re-read all the usual texts - you know, the ones that litter the conversations of the bloggerati: The Cluetrain Manifesto; The Tipping Point; The Wisdom of Crowds; etc etc. The books were interesting, but I was still missing something (maybe I'm just a bit stupid).

Then I read a paper by John Hagel and John Seely Brown - From Push to Pull: Emerging models for mobilizing resources. It's pretty theoretical in places, and it's a work-in-progress; but I suddenly realised that I'd fallen into the age-old IT industry analyst's trap: I was starting with the technology. I was thinking that what I needed to do in this project was to map how technology changes in the Web domain will affect what IT needs to do to support business needs.

But the reality is that there's an equal degree of change in business, towards a post-industrial model that recognises that the old rules don't necessarily hold - and this change maps closely to what's happening on the web.

What's much more profound is that trends in the world of the Web and in the world of business are driving each other - and this feedback loop is going to have a far-reaching effect not only on enterprise IT and how it meets business needs in the future, but also on how businesses will look.

Hence, thanks to JH and JSB, the title of this post: Introducing the Uncompany*: the future of business, shaped by the forces of globalization, the drive for transparency and the desire to engage effectively with smart, connected markets.

What does the Uncompany look like? What will the impact be on how technology is used in business? What impacts will this have on IT suppliers, buyers, architects, and operations people?

I'm working on it.

*If you're wondering why I'm using this term, see here.
Monday, April 10, 2006

On MWD FM this week...

Our second podcast episode is now live.

This week's episode is with all three MWD analysts - Neil Ward-Dutton (your host), Neil Macehiter and Jon Collins. Welcome Jon!

In this episode, you'll hear why project portfolio management isn't sexy; why it's so important for IT industry analysts to keep working with real people doing real jobs; and when SOA will hurt the big packaged enterprise application vendors. Lastly, you'll hear Jon Collins invent a new term - Stick Oriented Architecture...

Don't miss it. (You can subscribe to the feed here).

Like we said before: if you're an MWD subscriber (which, don't forget, costs nothing) we'd love to hear of questions that you might like us to answer. We're quite happy to sit back and shoot the breeze on the topics that we think are interesting - but it'll be even more interesting if we can also tackle the topics that *you* think are interesting.
Thursday, April 06, 2006

Plugging an identity-related compliance hole

I just got off the phone ("off the Skype Out" doesn't have the same ring to it ;-)) from a briefing with Cyber-Ark Software. The company has been around since 1999, with headquarters in Massachusetts - the founder actually originates from Israel - and has received $23M in funding. It has more than 200 customers, including the likes of ABN Amro, Citigroup, ING, Lehman Brothers, Pfizer and Wells Fargo.

Cyber-Ark's business is based around what it refers to as secure Vaulting Technology, for the storage and exchange of sensitive information. What on earth has that got to do with compliance holes? I posed myself the same question at the beginning of the call. In fact it appears from my discussion that the link to identity and compliance is only something that Cyber-Ark has made in the last 12-18 months, as more and more companies have come under the scrutiny of auditors as a result of regulatory compliance. The link between identity and compliance is nothing new (and something I discuss in our recent identity management report) so what's the story?

At its heart it relates to privileged accounts: Unix root; Oracle's inbuilt system and sys accounts (as a former Oracle DBA it brought back memories of customers looking aghast as I sat at the SQL*Plus prompt and informed them that I was in a position to delete their entire database even though they hadn't given me the passwords); Cisco's IOS enable; accounts embedded in batch and admin scripts and so forth.

These accounts pose, quite rightly, a problem for auditors. By virtue of their privileged status, users of those accounts have significant power to potentially bypass audit controls. This makes it difficult to prove - and that's what auditors want - that this power has not been abused. This is where the Secure Vaulting technology comes in. The privileged account details are treated as sensitive information, stored in encrypted form in an Enterprise Password Vault. If an administrator requires privileged access, they first have to go via the vault, which maintains an audit trail of all such accesses. Obviously, it can't control what's done once logged on as an administrator, and so has to be implemented as part of a broader auditable security framework, with access control and audit. But it ensures accountability - and also periodically regenerates passwords. It also overcomes the challenge of the forgotten administrator password scenario (I remember being shown a hack on a SunOS 4 workstation to recover the root password 30 minutes before a critical product demonstration where rebuilding the system wasn't an option!); the "DBA's been run over by the proverbial bus" problem; and disaster recovery situations.
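The vault pattern described above can be sketched in a few lines of Python. This is purely illustrative - the class and method names are invented for the sketch, not Cyber-Ark's actual API:

```python
# Sketch of a privileged-password vault: credentials are held centrally,
# every check-out leaves an audit record, and passwords are periodically
# regenerated so no administrator holds a long-lived secret.
import secrets
from datetime import datetime, timezone


class PasswordVault:
    def __init__(self):
        self._passwords = {}
        self.audit_log = []

    def store(self, account):
        # Credentials are generated by the vault, never chosen by a human.
        self._passwords[account] = secrets.token_urlsafe(16)

    def check_out(self, account, admin):
        # Every access to a privileged credential is recorded - this is
        # the accountability that auditors are looking for.
        self.audit_log.append({
            "account": account,
            "admin": admin,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        return self._passwords[account]

    def rotate(self, account):
        # Periodic regeneration limits the life of any leaked credential.
        self._passwords[account] = secrets.token_urlsafe(16)
```

As noted above, a vault like this cannot constrain what an administrator does once logged in; it establishes who took the credential and when, which is why it has to sit inside a broader security framework.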

This is not something that is acknowledged, at least in my research, by current identity management players, and it's therefore no surprise that Cyber-Ark has established partnerships with the likes of IBM with Tivoli Identity Manager.

Although Cyber-Ark seems to address a very real issue, it seems to me that the company faces a couple of challenges. First, there's the fact that they don't fit well into the product/technology buckets of the big analyst firms (but perhaps should have a place in Compliance Oriented Architectures - hint to our friends at RedMonk). Second, and more significant, is that the compliance risks associated with privileged user accounts are not well understood - until after an audit.

Whether or not organisations need an off-the-shelf solution to this problem, it seems to me that Cyber-Ark raises an important issue that cannot be ignored in planning for compliance.
Wednesday, April 05, 2006

What is Web 2.0?... lucky I wasn't drinking

I'm currently waist-deep in a research programme on the potential impact of "Web 2.0" ideas on enterprises. Really fascinating stuff that should start to see the light of day in a MWD perspectives report in the next couple of months.

Apologies to those of you to whom this is old hat, but just now I ran across this graphic when reading the results of a reader survey conducted by the Register in November shortly after the O'Reilly Web 2.0 conference:


[Reproduced from Register website; copyright is theirs]

Thank goodness I wasn't drinking a coffee at the time - it would have come out of my nose. I haven't seen anything that funny in a long time. Which probably says I should get out more.

I guess if there's any kind of serious point I can take from this, it's that denizens of the US West Coast startup community looking to European (and UK in particular) businesses as prospects for their wares are going to have to work pretty damn hard ;-)
Tuesday, April 04, 2006

Writing the rules of regulation

Hurrah! I’ve just been to a Cisco press event, and for the very first time I have seen a vendor presenting on “how to cope with the vagaries of regulation” rather than “companies need to implement compliance technologies”. I have frequently argued in the past that compliance was a post-Enron US invention, jumped upon by our transatlantic friends as an opportunity to sell data management software. For European businesses, the problem was not whether “compliance” was a good thing per se, but rather, the attitude that it was something new – we have been coping with the shifting sands of regulation for tens, if not hundreds (or maybe thousands) of years. Compliance has been treated by vendors as a stick rather than a carrot, but it looks like this is changing, at least at a high level.

All the same, “coping with the vagaries of regulation” remains a major challenge. Consider, for example, the UK Data Protection Act, in force for nearly 10 years now. The DPA requires that companies implement appropriate protections on their customers’ personal data. However, other than by conducting a full, manually controlled audit, it is almost impossible to prove whether an organisation is meeting its DPA requirements or not. It becomes even more complicated when we consider that most, if not all, companies are faced with a stack of other data-oriented regulations. Some of these have conflicting requirements (for example in terms of data retention), most are subject to interpretation at one level or another, and all are open to abuse.

Even given the regulation-oriented approach, I don’t believe that companies will be out of the woods any time soon. It all comes down to the management of risks, and the implementation of rules. Briefly, risk management requires companies to understand the breadth of regulation, weighed against the range of potential outcomes should things go wrong. The inevitable trade-offs result in the definition of business policy, in terms of process-related business rules as well as in terms of continuity planning and security management. Even on paper, a comprehensive set of rules would already be a major achievement for any organisation; the challenge is then to apply such rules to the IT environment. Today, while there are tools for isolated rules management in specific areas, no single tool exists that can straightforwardly implement and manage the ensemble of such rules in a live data centre. And even if there were such a tool, regulations continue to evolve and any implementation would very quickly become out of date.
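To make the data-retention conflict concrete, here is a minimal, hypothetical sketch in Python - the regulation names and retention periods are invented - showing how even a trivial rule check can surface a clash that then needs human interpretation:

```python
# Two regulations impose different retention periods on the same data class;
# a simple check surfaces the clash for a person to resolve. The rule set
# below is entirely made up for illustration.

RETENTION_RULES = [
    {"regulation": "Regulation A", "data": "customer-records", "keep_years": 7},
    {"regulation": "Regulation B", "data": "customer-records", "keep_years": 2},
    {"regulation": "Regulation A", "data": "call-logs", "keep_years": 1},
]


def find_conflicts(rules):
    """Group rules by data class and flag any class whose required
    retention periods diverge across regulations."""
    by_data = {}
    for rule in rules:
        by_data.setdefault(rule["data"], set()).add(rule["keep_years"])
    return [data for data, periods in by_data.items() if len(periods) > 1]
```

Here "customer-records" would be flagged, since keeping the data for seven years to satisfy one regulation may itself breach another; resolving that trade-off is a matter of business policy, not tooling.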

This should not be seen as an excuse for inaction. It is possible for companies today to define rule sets that they believe demonstrate how they meet the set of regulations that apply to them. It is also possible for IT departments to demonstrate how they manage applications and data in a way that meets the rules, based on an appropriate use of point products - as listed by our esteemed brethren at Redmonk. This approach may be siloed by application or system, but at least it’s a start, and it prepares the way for more comprehensive (and as discussed by Redmonk, better architected) frameworks for rules automation, should such tools exist at some point in the future. For the latter we can only hope that the wave of acquisition activity on both sides of the pond (as indicated by SAP's purchase of Virsa Systems) will result in integrated solutions for companies of all sizes, to solve these age-old problems.

The virtualisation battle moves into the next phase

VMware announced yesterday that it is making its virtual machine file format freely available, with no license or royalties. Despite also claiming that the company
is committed to supporting any other open virtual machine disk formats broadly adopted by customers and working toward converging on open standards in this area
this announcement is clearly motivated by the desire to ensure that VMware's virtualisation technologies are at the hub of a healthy ecosystem of third parties offering a range of value-added capabilities, such as backup/recovery, provisioning, performance optimisation and so forth. To emphasise the point, VMware had Akimbi Systems, Altiris, BMC Software, PlateSpin, rPath, Surgient, Symantec and Trend Micro on hand to explain how they intend to use the file format. VMware first made its intentions clear in this regard with the announcement of the VMware Virtual Infrastructure SDK in June 2004.

VMware is not alone. At the Microsoft Management Summit in April last year (see our report Microsoft bids for role as enterprise management player), Steve Ballmer announced royalty-free licensing of Microsoft's equivalent Virtual Hard Disk (VHD) format. As of yesterday, Microsoft has signed up 45 licensees, including XenSource which is leading the open source community behind the Xen hypervisor technology, as well as offering a range of value-added solutions based on Xen.

Yesterday, Microsoft also announced that Virtual Server 2005 R2 is now available as a no-charge download. This announcement, like VMware's release of the free VMware Player (in December 2005) and VMware Server (in February this year), is indicative that the core virtualisation engine is becoming a commodity - as of course is Microsoft's intention to include a hypervisor in the Windows operating system.

It is this commoditisation which is at the heart of these moves by VMware and Microsoft. The virtualisation battle is not going to be won on the basis of who is armed with the best engine, particularly in the face of open source alternatives such as Xen, which the likes of Red Hat and Novell are building into their Linux distributions. The spoils will go to the vendor whose engine can be harnessed most effectively. VMware clearly recognises this and is investing heavily in technologies such as VirtualCenter. It's not lost on Microsoft either, but it still has some way to go to catch up.

But both vendors clearly recognise, as the opening up of their respective file formats indicates, that they can't do it alone. In particular, they need to ensure that the leading management players - BMC, CA, HP and IBM - are on board since that's who many of their target customers will be turning to for a lead. VMware has done an excellent job of cultivating partnerships with these players but none of them can afford to ignore Microsoft.

These moves are good news for enterprises. Not only do they increase competition and, as a result, choice: they also drive innovation in the management areas required to maximise the potential benefits of virtualisation technology. In the absence of industry-wide open standards, enterprises will rely on these management solutions to abstract the underlying virtualisation engine.
Monday, April 03, 2006

We're podcasting

As some of you may know, we have already participated in a couple of podcasts, on SOA in 2005 and business process management, with Dana Gardner of Interarbor Solutions. Now it's our turn to take control of the microphone.

Our first foray into the podcastosphere (?) can be accessed here (or you can subscribe to the feed) and features a discussion between the two Neils, where we consider a number of questions:

What does Microsoft have to do with polyester shirts, and why is this important in the context of its strategy and the idea of Web 2.0?

How does the idea of user-centric identity relate to what the Liberty Alliance is doing?

What *is* Infocard, exactly?

It would be great to get your comments. If you are an existing MWD subscriber (and if you're not you can become one at no cost here) we would welcome your suggestions for topics you would like us to cover in future podcasts or specific questions you would like us to answer.

Vista delays give time to reflect

There's been plenty of kerfuffle about the delays in Windows Vista, for better or worse, and there is little to add to the debate - Grady Booch has an evenhanded summary of proceedings here. Corporate SA/volume license agreement customers will still be getting a pre-holiday-season version of Vista, but based on our research Microsoft might have been better off delaying that as well. As we discuss in this report, produced in partnership with Freeform Dynamics, we're not sure that any delays will make a great deal of difference to business customers, who show little inclination to move to Vista any time soon. Perhaps Microsoft should take the opportunity, and the time, to identify exactly why businesses would want to undertake an extensive upgrade of their desktops, and exactly how the benefits of Vista would outweigh the costs of the migration.

