Correction: InfoCard federates

Johannes Ernst, whom I linked to in my piece on InfoCard last week, wrote in to point out that I erred in my quick description of the service. He says that in InfoCard:

…the PC does not actually store the identity information, only pointers to it. The actual identity information is stored by identity providers, who are the “3rd party” in the system (the other ones being the relying party, such as a website, and the PC component).

This makes InfoCard much less like Apple’s Keychain (or for that matter the existing Windows saved password feature) and more like, well, a federated identity system. Interestingly, this is consistent with what I remember from the discussion of the future of Passport back in 2001 with MSN execs.
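To make the distinction concrete, here is a rough sketch of the two models in Python. Everything below is invented for illustration and is not how Keychain or InfoCard is actually implemented; it just shows the shape of the difference Johannes describes, where the Keychain-style entry holds the secret itself and the federated-style entry holds only a pointer to an identity provider that holds the real data.

    # Hypothetical sketch only: names and structure are invented for illustration,
    # not taken from the actual Keychain or InfoCard implementations.
    from dataclasses import dataclass

    @dataclass
    class KeychainEntry:
        # Keychain-style: the secret itself lives on the local machine.
        site: str
        username: str
        password: str

    @dataclass
    class InfoCardEntry:
        # Federated-style: the PC component stores only a pointer to an identity
        # provider, which holds the actual identity information.
        card_name: str
        identity_provider_url: str
        claims_offered: list  # e.g. ["name", "email"]

    def present_card(card, relying_party):
        """Simulate the round trip: the PC asks the identity provider for a token
        scoped to the relying party; no secrets are read from the local disk."""
        return {
            "issuer": card.identity_provider_url,
            "audience": relying_party,
            "claims": card.claims_offered,
        }

    card = InfoCardEntry("Work identity", "https://idp.example.com", ["name", "email"])
    print(present_card(card, "https://shop.example.com"))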

Good wines of Virginia, oxymoronic no longer

As a newly minted oenophile traveling around my home state in the mid-nineties, I discovered two things:

  1. Virginia had wineries, many tucked into scenic ruins like the Jeffersonian house at Barboursville.
  2. Many of them made wine that only a mother could love.

I was always a fan of Barboursville, but felt the winery did best with its blends and with the lesser-known sweet grapes (Malvasia, for instance), and had a ways to go on its core reds (though its Sangiovese and Cabernet Franc held promise). Other wineries were equally variable: Lisa and I enjoyed our first date at Naked Mountain, but found its chardonnay undrinkable a year or two later as our palates matured. And often the best thing that could be said about tasting wines from other vineyards in the annual Virginia Wine Festival was that it got you out of doors.

An article in Friday’s New York Times suggests that the wines are significantly improving. Might be worth visiting again soon, particularly now that the shipping laws are (might be?) changing.

Getting things done: Tiger Mail

I didn’t have much chance to do anything with Tiger last week while I was on the road, but this morning I finally started playing with Smart Mailboxes in Mail, which is one of the features I most eagerly anticipated for this upgrade. And it is fantastic, even with just one or two smart mailboxes created.

Originally I had anticipated replacing some of my 150+ mail rules (I have a hierarchical mail folder structure that takes a lot of care and feeding) with smart mailboxes. While I may still investigate doing that, I found that the first smart mailbox I implemented is probably the most useful one I’ll create: Unread Mail. The mailbox has a single condition: collect all unread mail messages. This is great for me because of all the mail rules I’ve implemented, which spread a typical day’s mail across a bunch of different mailboxes. That’s generally a good thing for scoping messages for later retrieval, but less good if I just want to read my 30 or so new mail messages at one sitting without changing context between ten different folders. The Unread Mail smart folder allows me to just read all the mail without worrying about filing it, because it’s already filed. I used a system like this on Outlook when I was running Office 2003 at Microsoft, and that folder plus one for flagged mail completely revolutionized my workflow.
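If you haven't looked at smart mailboxes yet, the mental model is a saved query over all of your mail rather than a folder that messages get moved into. Here's a toy sketch of the idea in Python; it has nothing to do with Mail's actual implementation (which is a Spotlight-backed saved search), and the message fields are invented:

    # Toy illustration of the smart-mailbox idea; fields and data are made up.
    from dataclasses import dataclass

    @dataclass
    class Message:
        mailbox: str    # where a rule filed it, e.g. "Work/Projects"
        subject: str
        read: bool
        flagged: bool = False

    def smart_mailbox(messages, condition):
        """A smart mailbox is a live query: nothing moves, it just filters."""
        return [m for m in messages if condition(m)]

    mail = [
        Message("Work/Projects", "Status report", read=False),
        Message("Personal/Wine", "Barboursville newsletter", read=True),
        Message("Lists/Mac", "Tiger tips", read=False, flagged=True),
    ]

    unread = smart_mailbox(mail, lambda m: not m.read)   # my "Unread Mail" mailbox
    flagged = smart_mailbox(mail, lambda m: m.flagged)   # a flagged-mail counterpart
    print([m.subject for m in unread])  # messages stay filed wherever the rules put them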

I’ll be playing around with some more smart mailboxes in days to come, including one for recent messages (everything sent or received within the last week). It’s nice to have some tools that actually improve my productivity.

Roadside Tut

[Photo: roadside Tut]

Spotted a little after 8:30 driving south on 128: a car carrier with an Egyptianate gold-plated longboat and two giant King Tut heads.

Really.

After I passed it the first time, I got stuck in a non-moving line of cars waiting to exit, and was able to grab a photo (albeit blurry).

Man. I knew traffic in Massachusetts was weird, but I had no idea it was this weird.

Here’s hoping we’ll meet now and then

Howard Anderson at Technology Review: Good-Bye to Venture Capital. Howard, who was one of my professors at Sloan and who cofounded YankeeTek Ventures and Battery Ventures, says he’s getting out of the VC business because “technology supply is bloated” (i.e., there’s more technology available than people can buy); “the hype machine is broken” (i.e., executives are no longer spending money on tech like their hair was on fire); and “the financial markets for technology companies are no longer exuberantly irrational.” As a result, he won’t be raising any more funds.

Part of this, I know, is Howard being Howard—controversial and blunt-spoken. But how much of his analysis is on? Is there really no way to make the big returns any more? Or, as an anonymous colleague of mine puts it, is he taking his marbles and going home because it didn’t work out for him?

(Update: the comments on this BusinessWeek article and from A VC and Brad Feld suggest that it might be the latter.)

Google google google, google google google.

Three Google items:

  1. The new personalized Google interface has arrived, prompting cries of “Oh God, it’s a portal.” Good: the default interface is still clean and uncluttered. Bad: for now.
  2. Google Desktop went out of beta.
  3. Google AdSense for RSS feeds launches (see fellow Sloan class of 2002 grad Shuman Ghosemajumder’s post on the Google Blog for other links). I’m really of a mixed mind on this; on the one hand, bully for all those small publishers whose sites are supported by AdSense revenue, because they just got a way to make their RSS feeds pay for themselves. On the other, AH GOD GET THE ADS OUT OF MY RSS AHHHHH!

Geez. Even writing this post, I’m feeling a little too much like John Cusack. Malkovich malkovich? Google google google.

ITxpo: Keynote with Eric Schmidt of Google

The keynote interview with Dr. Eric Schmidt, CEO of Google, was as interesting for what he didn’t reveal as what he said. Eric has a bracingly dry sense of humor about his business and the industry, but he is deadly serious about the company and its responsibilities and challenges. He is also skilled at the art of offering insightful answers that do not directly answer the questions he is asked, so be forewarned as you read my notes. (Unless noted, everything below that is not a question is a paraphrase or direct quote of Eric.)

Q: Has the Internet leveled the playing field such that the dominant players in the industry can’t exercise their power? A: You know, “dominant” has a specific legal meaning…

There have been a lot of changes in the industry over the last 10-15 years. Back then, email was something you had to get in the car and drive to the office to do. I’d drop my daughter off, go to the office, do email, finish email and then hop in the car and go home…

(On how Google solves the performance and responsiveness issues that other companies might face online:) The problems are easier for Google because our data tends to be more static and lends itself to being replicated, as opposed to transaction-based data that changes frequently.

(On Google’s internal systems:) When I got to Google we were going to build our own financial system because we were frustrated with the limitations of Intuit’s QuickBooks and its five-user license. Seriously. So I said that isn’t going to work exceptionally well when it comes to Sarbanes-Oxley and auditing. We implemented Oracle Financials and put it in a box, and said “we aren’t going to change this.” Then we built a system around it to manage the business’s contact with it and we change that frequently.

Our focus is on personalization and comprehensiveness. For instance, you can see Google internet search results, your intranet (with the Google search appliance), and your hard drive all in one results list. Of course that brings in the issue of privacy, and that is where the “don’t be evil” corporate culture comes in. …Anyone can pull the ripcord and say “that’s evil” from an end user perspective and stop the train. [And no, Dave, I didn’t get a chance to ask him about autolinking, and he didn’t volunteer an opinion.]

We’re not in the information technology business, we’re in the information business. We have only digitized a very small percentage of all available content. There is a lot of room in the market.

(On advertising:) We don’t run the business as an advertising business. We run it based on end user satisfaction. If we keep our users satisfied, we keep our ad inventory up, which keeps advertisers happy. Q: You’ve taken away ad revenue from magazines and other traditional content players…. A: I prefer to think we’ve grown the market. There is a growing shift to more contextual ads and we are playing in that market. Q: You also seem to be important to a very large base of very small companies…. A: It is scary to understand that you are fundamentally in the revenue chain of a small business. We disseminate that information across the company and use it in planning products.

Q: You hired the lead developer of Firefox. Are you going to build a browser? A: We decided a long time ago that we would pursue a browser independence strategy, so that our services would work well on all browsers. These people, like the one you mentioned, are working on that, and do important work with the open source community as well.

Respecting your customers

Catching up on non-ITxpo related topics this morning, two things caught my eye. First, my delayed reaction to the announcement that the New York Times will be putting some of its content, notably op-ed columns, behind a for-pay wall starting in September. This is of course brilliant because the Times’s editorial opposite number, the Wall Street Journal, has its constellation of right wing editorial columnists available for free. So now there will be even less of an opposing voice online. What’s most depressing as a user and reader of the Times is that this move comes after a history of reader-and-blogger-friendly decisions, including RSS support. So long, NYT, we’ll miss you. Is there an editorial forum out there that wants to stay on the record, and stay in the conversation? (For straight news, the BBC is looking better all the time.)

Second, the announcement from Microsoft about their new ID infrastructure, InfoCard. On the surface, the announcement sounds a lot like Apple’s Keychain: a local system solution to hold identity information such as login names, passwords, and certificates. The difference is that InfoCard, like its failed Passport predecessor, can also hold credit card information. The shift in Microsoft’s identity management strategy, from central control to user application, represents a clear victory for Microsoft’s customers, and may be a pretty good indication that Microsoft is doing a better job of listening than it was four years ago. (More information about InfoCard, including a description of the user experience and some underlying technology notes, courtesy Johannes Ernst.)

Connection? Your customers will be the people who tell you whether your new business plans will succeed or fail. Learning to listen to them is a skill you must master if you are to compete.

—Which gets me nice and warmed up for the final session I’ll attend at ITxpo, Are Your Customers and Users Revolting?, where three Gartner analysts will discuss customer collaboration and communication technologies and the implications for enterprises. I’m going to see if I can arrange some sort of connectivity in the room so I can blog the session, but otherwise I’ll take notes and post later.

ITxpo: Cisco and IBM

Charles Giancarlo and Steve Mills are on a panel with two Gartner officers. Giancarlo defines “complexity” as anything that causes customers headaches in implementation. Customers want simplicity, by which they mean that solutions are thoroughly tested for their environment, not fewer features. He points out that we simplify the mundane (networking protocols) and then build greater complexity atop the newly simplified stack. He also correctly points out that the desire for differentiation is a big source of complexity.

Mills says that complexity in software is a reflection of the desire for more autonomy from central IT control, and stems in some respect from the decentralization of IT infrastructure starting with the shift to minicomputers in the 1970s. He also says that software will continue to get more complex: “software developers, given enough resources in time, will reinvent the work of everyone who’s come before them.” He suggests that encouraging reuse and adoption of open source can help to simplify the stack by discouraging the development of new code, and that bloat is primarily caused by a cultural issue among software programmers. Giancarlo agrees: if improvement in software code doesn’t directly drive customer benefit, then it isn’t worth doing.

Mills says that defeating the developer’s cultural tendency toward reinventing the wheel requires: time-boxing, or defining short-term incremental projects; starving the team of resources; and embracing user-centered design.

(My battery is dying; more to come.)

Later: In discussing how to improve reliability in the face of proliferating software versions, Mills talked about reducing redundancy in code and reproducing known configurations and feature paths in the test environment. Giancarlo talked about lessons learned from integrating Linksys’s consumer business in creating a balance between complexity and functionality.

Causes: infrastructure is cumulative (earlier today someone said applications never die; the consensus appears to be that retiring outdated IT offerings is almost impossible). One thing that might help is, following Drucker, to continue to push good ideas down into silicon so that they move lower in the stack. Unfortunately, says Giancarlo, complexity grows organically: today’s skunkworks project is tomorrow’s killer competitive advantage, so it adds to the complexity. The wrong thing to do is to try to remove complexity from new products; rather, look at them when they provide comparatively less value and try to remove complexity then. Also, the multiplicity of architectures can be a political issue.

Why to reduce complexity through SOA: Mills: money falls through the cracks in the handoffs between different legacy systems. You need to add some IT complexity in end-to-end monitoring to enable the business to reclaim the money lost in its patchwork of pre-SOA systems. Governance is one of the most important solutions for reducing complexity.

ITxpo side note: blogging

[Image: Feedster search results]

One thing I find interesting is the small number of bloggers in attendance at the conference. I’m currently sitting next to the only other blogger on the conference blogroll, Boris Pevzner of Centrata (and of MIT Course 6 mid-90s). Feedster and Technorati don’t turn up many hits for ITxpo; in fact, Feedster notes that this site is the biggest contributor and helpfully offers to scope the search relative to this site, which is a little scary. Likewise, most of the hits in Technorati are follow-ups to press releases or announcements made at the symposium. Is there so little of value that happens at the ITxpo that no one has thought to blog it before, or is it just that the sort of organizations that have embraced blogging are under-represented among the attendees?

Certainly the logistics have a way to go before the conference becomes truly blog-friendly, with none of the Socratic dialog (or good WiFi) that characterizes the best of the unconferences I’ve attended. But the presence of the conference blog is a nice first step.

ITxpo: Real Time Enterprises

Ken McGee is speaking about real time enterprises. He claims that the real time enterprise, in which IT moves beyond its traditional boundaries to add real value through real time monitoring and modeling of events in the company, makes business uncertainty “unnecessary and avoidable.” The talk is an expansion on Ken’s book, Heads Up. The idea is to monitor, capture, and analyze root causes and overt events and use them to make near real time decisions.

One way to leverage the benefits of real time decision making, McGee suggests, is to publish financial information more frequently, which not only mitigates compliance risks (à la Sarbanes-Oxley) but also has the side benefit of attracting investors eager to get more frequently updated information about the performance of their portfolio. Another is to investigate dynamic pricing. Both of these are predicated on taking known IT capabilities to the next level. In the case of dynamic pricing, this includes supply chain management, sales, electronic ink (for retail price display), wireless, and future demand prediction.

McGee also discusses decision criteria for what should be monitored in real time: only choose the information that, upon receiving it, would make a decision maker change her course of action (this does not include most “dashboard” information), and where such a decision would have a positive effect on the top ten revenue-generating (or cost) business processes. This turns out to yield a very small number of real time factors.
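Put another way (my paraphrase, with invented example signals), the filter is two yes/no questions applied to every candidate metric, and very few candidates survive both:

    # My paraphrase of McGee's criteria; the candidate signals are invented.
    candidates = [
        {"signal": "hourly order volume",  "would_change_a_decision": True,  "touches_top_processes": True},
        {"signal": "server CPU dashboard", "would_change_a_decision": False, "touches_top_processes": False},
        {"signal": "weekly returns rate",  "would_change_a_decision": True,  "touches_top_processes": False},
    ]

    real_time_factors = [
        c["signal"] for c in candidates
        if c["would_change_a_decision"] and c["touches_top_processes"]
    ]
    print(real_time_factors)  # the surviving list ends up very short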

But he also says that IT people are going to be the most likely to block real time data gathering efforts. He doesn’t dive into this deeply enough, in my opinion. This is the area that might really yield some insight into the dynamic between decision making and IT.

ITxpo: Service desk best practices and methodologies

This morning’s first session for me, Excellence in IT Service and Support, was a review of best practices in service desk management. A few interesting data points came out in the context of the talk, including informal survey results (of Gartner Data Center conference attendees) indicating that about 62% of respondents intend to adopt ITIL, either alone or with other methodologies; the number for ITIL alone was 31%.

This was interesting to me because the further one gets in ITIL from the service desk and the incident and problem management processes, the more the standard’s coverage starts to overlap with other well-defined process libraries. For instance, the ITIL processes that deal with internally developed applications, including release management and change management, overlap naturally with the Capability Maturity Model Integration (CMMI), which has software development as its focus. Organizations with internally developed applications need to consider not only which portions of ITIL, but also which portions of other methodologies, they may need to adopt as they carry out process improvements.

ITxpo: IBM announcement press

Some follow-up links on IBM/Tivoli’s ITSM announcements yesterday.

CNET: IBM’s Tivoli tackles IT processes: “The majority of application failures are due to changes that get introduced to a working system, said Bob Madey, vice president of strategy and business development for Tivoli.”

ComputerWorld: IBM unveils Tivoli systems management software. “The idea of having a centralized database for tracking IT assets in an organization arises from long-standing recommendations by the Information Technology Infrastructure Library (ITIL) and other systems management groups. BMC has already announced a centralized database, but IBM believes a federated approach makes more sense because many companies have infrastructure databases already, Madey said.”

The MarketWire release talks about the expansion of IBM’s Open Process Automation Library, aka the Orchestration and Provisioning Automation Library (OPAL), to IT service management, a point I failed to capture yesterday because I need to know more about it before I fully understand the implications.

Follow-up: iChat issues in Tiger

Yesterday’s update to Tiger does not address the iChat issues that many newly upgraded users are having, and this new support article, “iChat AV 3.0: ‘Insufficient bandwidth’ messages,” indicates why. The article suggests that iChat’s newly added QoS features are (irony alert) breaking the iChat experience for many users, since some ISPs block traffic carrying the DSCP (Differentiated Services Code Point) marking that iChat uses to implement the feature.
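For the curious: DSCP is just a six-bit marking in the IP header that asks routers to give the traffic particular handling. Here is a minimal, generic sketch of how any application might set it on a UDP socket; this is not iChat’s code, and the value used is just a common example:

    # Generic illustration of DSCP marking; not iChat's implementation.
    import socket

    DSCP_EF = 46              # "Expedited Forwarding", a value often used for real-time media
    TOS_VALUE = DSCP_EF << 2  # DSCP occupies the top six bits of the old TOS byte

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

    # Every packet this socket sends now carries the marking. If a router or ISP
    # mishandles or drops marked packets, the application just looks broken,
    # which appears to be what the support article is describing.
    sock.sendto(b"hello", ("192.0.2.1", 5060))   # 192.0.2.1 is a documentation address
    sock.close()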

Question: what do you call a change to an application that breaks existing functionality for many users? Where I work, we call that a regression bug, not a feature.