The Culture of Import

The only way we’ll see everybody in the enterprise using ECM is if ECM becomes everybody’s desktop. Documentum understood this from the beginning; they wanted to control documents from creation to destruction and head off trouble at the docbase’s borders: client-side caching, import/export, inter-document linking (OLE, HTML, XML), etcetera. Desktop Client even tried to look like and live within the operating system to hide the gaps, but some bad architectural choices and the irrational exuberance over web applications doomed it.

The Culture of Import needs to change before ECM can become truly pervasive. I’m still surprised by how many long-time Documentum users still create documents or spreadsheets on their local disk drives, only importing them into a docbase when the border guards catch them. It’s an old problem that persists even after this latest panic over records management–not just because we’re used to it but because our tools reinforce the idea. However, some new technologies might help change people’s minds.

Adobe AIR and Google Gears may have their part to play by blurring the here/there distinction between “My Computer” and “My Network”. Google Apps comes closer to total document control than Documentum ever has, since people create, edit, embed, and share documents completely within Google Apps and Google Sites. I can’t say for sure whether they understood what they were doing from an ECM perspective, but Google is no stranger to paradigm shifts, as Google Mail’s use of tagging and conversations demonstrates.

Unfortunately Microsoft’s dominance of the desktop and Office applications means we won’t see the necessary culture shift until they get it. Hopefully the building excitement around cloud computing and software as a service will help the software giant overcome its inertia and shed its file/folder mindset. Then you’ll only see Sharepoints instead of disk drives and share areas when choosing “File > Save as” from Word’s menus.

Ruby on Rails — Get Excited!

I had no idea Ruby on Rails was so cool. The language looks a little odd, but it’s decipherable. It’s the templating and rapid development that really caught my eye in the “15 minute blog” demo. If you don’t know RoR, watch it and try to tell me you don’t get a little excited.

I’d been thinking of using my Resume-o-matic as an excuse to play with Hibernate–and Spring to a lesser extent–but it looks like a capital-P Project to get started. If RoR’s got applicable templates, I could do a little diagramming and fast prototype it from there. Mmm, tasty.

Of course I’m already excited about (or at least interested in) a bunch of other things like Google Sites and Groovy and Python 2.5; they all popped up on my radar last week. It’s technological feast or famine as usual.

The Duplicate Folders Mystery, Part II

I was staring at the impossible in early 2004: two folders named “Reports” in the same place. That’s just not supposed to happen. (Refresh your memory on why with Part I.) After some idql and iapi snooping, here were the facts:

  1. Both folders did indeed have identical object names without leading or trailing whitespace.
  2. They did NOT have any values in common between their i_folder_ids, so they weren’t really in the same place. But Desktop Client thought otherwise.
  3. One had r_folder_path values with trailing whitespace in one of its containing folders.
    e.g. “/cabinet/folder1 /folder2/folder3” — Hard to eyeball the space after “folder1” unless you’re looking for it!
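For the record, this kind of corruption is easy to hunt for from idql. A query along these lines (a sketch only — adjust the type and patterns for your own docbase) flags folders whose r_folder_path contains a space right before a path separator:

```sql
-- Illustrative query: find folders whose r_folder_path has an
-- embedded "space before slash", like '/cabinet/folder1 /folder2'
SELECT r_object_id, r_folder_path
FROM dm_folder
WHERE ANY r_folder_path LIKE '% /%'
```

Running something like this against a suspect docbase would have surfaced the whole family of bad paths at once instead of one eyeball-straining folder at a time.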

After that, I couldn’t reproduce the problem. This docbase was well known for having “issues”, so nobody (including Documentum) was surprised that something impossible and inexplicable was happening. I mentally filed it under “Documentum moving in mysterious ways”, gave the business administrators a tedious but effective work-around, and got back to more urgent concerns.

The next year I ran into this problem again from the engineering side instead of the business unit. As one project wrapped up or its budget ran out, the company passed me around to different groups. I’d worked in engineering back in 2001, and my knowledge of the company’s custom environment made letting me go difficult.

I ended up helping out with the 4.2.8/5.2.5 upgrades by developing a triage procedure and fixing some of the more dire problems. Ever see those pictures of tumorous, sooty lungs the Cancer Society uses to scare people away from cigarettes? That’s what some of these docbases looked like on the inside. Not fun. Not pretty.

After triaging a hundred docbases, I discovered some of them also had corrupt r_folder_paths, although in smaller numbers than the original-problem docbase. Thankfully it didn’t interfere with the upgrade procedure. I made a few more notes before continuing on to my next engineering task, the arduous quest to get Documentum to fix the horribly-broken-under-Sybase content replication process.

Another year went by, I was passed along to another business unit, and a new folder-related problem emerged. Business users (in Webtop) reported empty folders and missing documents. The document management group (in Desktop Client) reported the documents were right where they filed them. If you guessed those empty folders had corrupt r_folder_paths, you were right. Webtop uses a slightly different query than Desktop Client to determine folder contents, and as a result it couldn’t “see” the contents of such a corrupt folder.

There Will Be Pain

You may want to take some ibuprofen before continuing …

This particular client as a rule doesn’t allow anybody to talk directly to the Sybase DBAs. Don’t ask why. Healthy docbases require attentive, knowledgeable DBAs working with developers and docbase administrators. I would probably have figured this problem out by year one if I’d had that kind of healthy relationship, but I didn’t.

Some suggested I master Sybase. Frankly, that’s wasted effort since Sybase is such a small part of Documentum’s market. If this were Oracle or SQL Server, sure. Thing is, the client desperately needed DBAs cross-trained in Documentum if it was ever going to tame the growing performance and stability problems. Despite getting a few people in management to nod their heads yes when I said this, nothing ever came of it. Sigh.

So I found the next-best thing to a current Sybase DBA, an ex-DBA in my part of the organization. I started asking questions, and he casually mentioned the “obvious” fact that the Sybase API strips all leading and trailing whitespace from VARCHARs passed to it. I was dumbfounded.

Call me a purist, but this is data corruption by design. If I programmatically pass a string to a database, I expect the exact same string back when I ask for it. That’s just common sense–unless you live on Planet Sybase–but it didn’t really explain the problems.

Most of the affected docbases had either custom scripts or Input Accel adding content, and they all had several layers of folders that related to content metadata. This was the second big clue; both methods of adding content sometimes had to create several layers of new folders to file the first document for a particular client, counterparty, or deal. That still didn’t explain it all.

The DMCL caches metadata for newly-created, fetched, and modified objects. That includes folders. It turns out that the cache doesn’t automatically refresh from the database after a save. Now we have the three components needed for the impossible to happen. Let’s walk through the process.

  1. Open iapi and connect to a docbase.
  2. Create a new folder “B ” whose name has a space at the end (OCR, bad typing, etc.), link it into cabinet “A”, and save it. The cache has object_name as “B ” and r_folder_path as “/A/B ”. Sybase stores them as “B” and “/A/B”.
  3. Create another new folder “C” in the same session, link it to “B”, and save it. It fetches “B”’s r_folder_path from the cache and generates its own r_folder_path as “/A/B /C”. Note the space after “B”, and also note that Sybase doesn’t have anything to trim because the space is within the string.
  4. Put some stuff in “C” and close this session.
  5. Navigate down through the new folders in Desktop Client, and “C” has content.
  6. Navigate down through the new folders in Webtop, and “C” appears empty.
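The interaction is easier to see in miniature. Here’s a little Python toy model (my own sketch, not real DMCL or Sybase code) of a store that trims trailing whitespace paired with a session cache that never refreshes after a save:

```python
# Toy model of the corruption (illustrative only -- not real DMCL/Sybase code).
# The "database" trims trailing whitespace like a Sybase VARCHAR; the session
# "cache" keeps the raw value and is never refreshed after a save.
class ToyDocbase:
    def __init__(self):
        self.db = {}     # persistent store: trims trailing whitespace
        self.cache = {}  # DMCL-style object cache: stale after save

    def save_folder(self, name, parent_path):
        path = parent_path + "/" + name
        self.cache[name.strip()] = path        # cache keeps the raw path
        self.db[name.strip()] = path.rstrip()  # store trims trailing spaces
        return path

db = ToyDocbase()
db.save_folder("B ", "/A")           # folder name has a trailing space
db.save_folder("C", db.cache["B"])   # child path built from the stale cache

print(db.db["B"])   # '/A/B'    -- trailing space trimmed on save
print(db.db["C"])   # '/A/B /C' -- embedded space survives; nothing to trim
```

Any client that rebuilds “/A/B/C” by walking the tree will never match the stored “/A/B /C”, which is exactly the empty-folder symptom.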

Both client applications use “r_folder_path” in the query that finds a folder’s content. One remembers and builds the path, trimming some whitespace as it goes along; the other uses literal values from r_folder_path. Can you figure out which is which? That difference is the root of the disappearing-content mystery.
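In DQL terms, the two behaviors might look something like this sketch using the FOLDER() predicate (my guess at the flavor of each client’s query, not the actual Webtop or Desktop Client source — and I’ll leave which is which as part of the riddle):

```sql
-- One client queries with the literal stored path and finds the content:
SELECT r_object_id, object_name FROM dm_document
WHERE FOLDER('/A/B /C')        -- note the embedded space

-- The other rebuilds the path, trimming whitespace as it navigates,
-- and comes up empty:
SELECT r_object_id, object_name FROM dm_document
WHERE FOLDER('/A/B/C')         -- no folder has this exact path
```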

The duplicate folders mystery arises from the fact that the DMCL won’t remove corrupt r_folder_path entries when relinking a parent folder triggers a cascade update on its children. That junk stayed in the r_folder_path of the moved “Reports” folder but didn’t block scripts from creating another “Reports” folder there since the validation is on i_folder_id. Both end up with the same corrupt path, as would a third after moving the second, and so on.

There you go. A combination of the DMCL cache, Sybase trimming whitespace, a DMCL application building several levels of folders, one of the non-leaf folders having trailing whitespace, and relinking predecessor folders causes data corruption that manifests itself in several different ways. Fighting other fires, Sybase strangeness, and the improbability of it all kept everybody guessing for years.

By the way, I couldn’t retrieve the API code from the client that demonstrates the problem, but savvy readers should be able to reproduce it.

What should developers do about this?

In custom DMCL code, either trim the whitespace yourself before setting the folder’s name or do a revert/refresh on each folder object right after saving it. I seem to recall that was advice often ignored in the early days of DMCL programming.
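In iapi terms, the defensive version of folder creation might look something like this sketch (from memory, so double-check the commands against your API reference before relying on it):

```
create,c,dm_folder
set,c,l,object_name
B
link,c,l,/A
save,c,l
revert,c,l
```

The revert right after the save discards the session’s cached copy and re-fetches the object from the database, so any whitespace the database trimmed disappears from the cache before the next folder builds its r_folder_path from it.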

In Input Accel, if you can’t hack it with VB or something to do the same, it’s time for a dm_job running dm_fix_folders as often as necessary. Actually, running it regularly would be a good thing for everybody to do.

I think the DFC didn’t manifest the problem, which may explain why I couldn’t reproduce it with Webtop or Desktop Client–that, or those apps trim whitespace along with their other client-side validations.

What should Documentum and Sybase do about this?

It’s probably unreasonable to expect Sybase to simply stop trimming whitespace since it may break many systems depending on the behavior. They could however add configuration options to turn off trimming at the database and session levels. Then Documentum could provide options for both on their end, maybe in server.ini. Some installations may depend on standard Sybase behavior when sharing or exposing registered tables to other Sybase instances, so a session option would be polite.

Until and unless Sybase gives the docbase server or DBA some way to deactivate the offending behavior, Documentum could do the same things a DMCL programmer would do explicitly in the DMCL. I still can’t stand the idea of an API trimming whitespace, so I’d prefer they force a cache update on a folder object after a save. The representation in the persistent store should take precedence over a cache to assure consistency with other processes that may have pulled a fresh copy of their own.

Does SQL Server have the same problem?

I never got around to testing SQL Server, but since it’s a Sybase derivative, I wonder. Oracle definitely doesn’t have the problem. I’d appreciate it if somebody with a SQL Server docbase would test this out and post a comment.

Hoover Dam Projects

I heard that my first Documentum project still lives despite three post-merger attempts to decommission it. It’s still my benchmark project because it was a real “Hoover Dam” from day one. That’s the kind of thing I don’t see anymore, real contenders for the 00000111 Wonders of the IT World involving Documentum. Sure, capturing the entire email stream of a 100,000-employee company will be impressive in a Rain Man sort of way, but we were eating up terabytes of disk and database back in 1998–back when millions of objects really meant big.

I completely lucked out after the Great Commodore Purge of 1993 (a 75% layoff with exit interviews held en masse in the cafeteria) by landing a job on a big project in a new technology for a great company with some of the brightest non-Commodore people I’ve met. So began my four-year apprenticeship as a software systems architect. Our team motto was “not for the faint of heart”, which translates into “the mother of all learning experiences”.

Maybe this was just the state of Documentum in the 1990s; I knew of similar projects all over the Northeast and heard rumors of other interesting things along the Pacific Northwest. Some were about huge repositories. Others were about authoring complex documents. The best ones were enterprise-wide efforts where Documentum and (SG|X)ML intersected. My first Documentum job was all three.

Now I’m looking at my prospects as my back-to-work deadline approaches, wondering where those great jobs are now. There don’t seem to be as many, and I have a few ideas about why that is.

Money is tight and the government is saying “Jump!”

There’s plenty of Documentum work out there, but most of it is coming out of tightening regulations and compliance crack-downs. That’s not inherently interesting because it’s not about creating new intellectual property and doing interesting things with it. It is, however, mission critical.

Money’s been tight for years. The economy has been in an overall slump since 9/11, falsely propped up by creative monetary policy and a stampede away from real industry to finance. That industry’s had its share of problems lately, but even Big Pharma–companies in the business of selling life–have suffered financial and regulatory setbacks. Don’t forget extra-tight IT budgets because of the unrealistic expectations offshore software development promised–and promptly failed to deliver. Businesses (other than Apple, Google, and Microsoft) don’t have buckets of cash to throw at every big idea that wanders by.

These two facts mean that projects must really compete for dollars from the capital budget. Unfortunately, those less-interesting records management projects are scarfing up the bucks because they have one adaptation that trumps all others: a government mandate with well-defined, severe penalties. The bright side is that economies turn around and records management projects do get done. Things will eventually get back to normal.

Business thinks it knows what Documentum is.

Documentum has enough mind share that people think they know what it is. They don’t. Perhaps they know what a particular piece of it does. Maybe they’ve bought into some of the counterproductive marketing from Documentum (the company) and EMC that recast the product as web content management or a way to sell more SAN. Worst of all, they may think they know what Documentum can’t do because of failed projects at the hands of integrators of questionable talent or intent, off shore and on.

These experiences, even just the fact of having so many experiences, aren’t going to attract the visionary, the radical, and the mildly megalomaniacal. You have to be a little crazy and totally overconfident to build a Hoover Dam or dig a Panama Canal. These people may already be a few software products beyond Documentum. They don’t notice business as usual, and that’s optimistically what Documentum has become.

It’s really up to EMC to get out there and evangelize Documentum as state of the art for more than their comfort zone. I would evangelize more, but that lack of a developer license to explore new features hamstrings me. IQs probably aren’t going to rise suddenly at EMC, so this problem’s here to stay.

There’s very little frontier left in document management.

In 1994, electronic document management was a frontier. The problem space was huge and mostly filled with dark matter problems, the kind you don’t know you have until you’ve solved simpler problems obscuring them. Only a few people had even considered what routes lie from problem to solution; a few of those people cashed out of Documentum years ago. Here were the raw materials for all kinds of frontier dramas, from Little House on the Prairie to The Donner Party. My kind of stuff.

Is there still room for those glorious victories–and equally glorious failures–that made the first decade of Documentum so much fun? I’m not ready to say “no” just yet, but something needs to give the problem space a really good shaking before I can answer “yes”. While the product’s certainly changed since the early 1990s, neither it nor any of its up-and-coming competitors have hit upon anything truly revolutionary. I’m not even sure what would count as revolutionary anymore, but I’d love to see some suggestions in the comments.

Are the interesting projects just lost in the crowd?

My final hypothesis is a little less pessimistic: Maybe those great projects are still out there, but they’re hard to pick out in a sea of so many other less-interesting projects resulting from a maturing market.

I’ve had words with a certain somebody about when document management is “enterprise”. She thinks it means everybody in the whole organization uses it; I’m fine with the term when it cross-cuts business aspects like department or time zone even if only a fraction of employees use it. We may be moving towards a world where she’ll be right, and the proof might be MOSS 2007.

I still haven’t touched it personally, but I did sit through a few hours of demos on Microsoft’s website. Aside from inheriting architectural problems from Sharepoint 2003, there is the potential for real document management here. It’s hamstrung by some of the usual Softie assumptions like “one document = one file”, but it’s a start. Brace yourself, because I’m the most optimistic about this case despite the Evil Empire influence.

Sharepoint 2007 could lay the foundation by really helping users “get it.” MOSS’s pretty face and stripped-down functionality might entice where Documentum has traditionally overwhelmed. MOSS becomes a stepping stone to Documentum instead of a replacement. Those smarter users could become the fertile sea for a new Cambrian explosion in the field, one that will make it fun and interesting again.

An Aspect of My Disappointment

I found EMC’s webinar, Leveraging Composer and Aspects, informative given the format and one-hour time limit. Composer is the Eclipse plug-in and looks good overall. Aspects are a software design technique that could revolutionize Documentum data architecture. How EMC implemented aspects, however, repeats the architectural mistake they made with DBOF. Oh well.

Composer looks competently-enough done to be a welcome replacement for the persnickety Documentum Application Builder (DAB). The look and feel (and clutter) is pure Eclipse–no sexy Apple minimalist influence here. The win here is one-stop shopping: Developers won’t split their attention (nor workstations their resources) across two programs. That’s definitely a Good Thing.

Aspects in theory are great. They allow per-instance extension of both state and behavior, so objects get what they need when they need it. The reduction in type bloat alone is worth it. There’s no need for all those “optional attributes” just in case a particular instance becomes a document of record or gets included in a submission. Fabulous. In happy wonderful fantasy land, I’d scrap dm_document completely for a true lightweight object, then staple on versioning, lifecycles, web content, retention, and whatever as needed.

Aspects bring some real architectural benefits, too. Adding an aspect at runtime doesn’t require ALTER TYPE activity which can crap out the database on some really big (or really underpowered) docbases. Older docbases will also benefit from this easy way to add and remove new functionality without the potential dangers of hacking around the existing type hierarchy. Documentum architects now have a built-in way to do composition, a technique that is largely replacing inheritance in OOP design theory because it keeps the design open and adaptable.
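Per-instance extension is easy to picture outside Documentum. Here’s a loose Python analogy (nothing Documentum-specific; the names are my own invention) of attaching state and behavior to one object at runtime without altering the type everybody else shares:

```python
# Loose analogy for aspects: attach state and behavior to one instance at
# runtime, without touching the class ("type") of every other document.
class Document:
    def __init__(self, name):
        self.name = name

def attach_retention(doc, years):
    # per-instance state...
    doc.retention_years = years
    # ...and per-instance behavior
    doc.is_expired = lambda age: age > doc.retention_years

a = Document("contract.doc")
b = Document("memo.doc")
attach_retention(a, 7)   # only this instance grows the new attribute

print(hasattr(a, "retention_years"))  # True
print(hasattr(b, "retention_years"))  # False
```

No ALTER TYPE equivalent anywhere: “b” and every other Document are untouched, which is exactly the composition-over-inheritance payoff described above.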

Aspects in practice aren’t so great. My fear that aspects would use the DBOF model of requiring Java code client-side has been confirmed, although they made an effort to minimally support DQL since attributes are involved. (No word on DQL limitations imposed by aspects yet.) Turning the docbase into a JAR pusher has improved the deployment situation, but it’s still pushing functionality too far up the stack for my liking. More moving parts mean more things can fail, get out of sync, or confound.

This design flaw will persist for as long as EMC is swinging their Java hammer, even on screws like scripting. Perhaps EMC catching on to Jython a few months back means they’re reaching back into the toolbox. Funny how they had to make a simple script look so big with all the extraneous print statements–very Java of them. If only there were another way to pass messages from a client to Documentum without the DFC. Hmmm.

Here’s what I’m thinking: All custom client applications and integrations should use DFS going forward–assuming it doesn’t have its own issues. DFC programming should be restricted to the uber-server, that big box drawn around the real docbase servers, webtop servers, DFC servers, full-text indexing servers, site caching servers, etcetera. Only push DARs–yes, they had to go there–to Documentum application servers. What goes on in the uber-server stays in the uber-server–including every single line of DFC code.

Aspects are too useful to discount for a few architectural missteps–as long as you’re careful where you walk.

Tech Deep Dive: Exploring Documentum Architecture — Leveraging Composer and Aspects
Available soon on EMC Events On Demand (developer access may be required)

When Free Equals Evil

There’s a dark side to the Wired article I mentioned earlier [Microsoft Giving Away Developer Software]. Most people like free things, but when is free (as in beer) a bad thing? When somebody’s trying to get you drunk and take advantage of you. Wired touches on how Microsoft is using the free bundle in its war against Adobe’s Flash. I don’t know enough about either Silverlight or Flash to argue relative merits, but I do know Microsoft well enough to worry that they may succeed on purely non-technical grounds. After all, they are the master strategists of “free equals evil”.

Marketplace Squeeze-Out

Remember Netscape? (Bonus points and a free bottle of Geritol if you remember Mosaic.) They made a business out of a web browser, and that eggs-in-one-basket strategy certainly contributed to their decline.

Microsoft successfully drove Netscape out of business by including Internet Explorer with Windows for free. The first IE was horrible, but its icon was there on everybody’s desktop, begging to be clicked. Don’t underestimate the compulsion to click that shiny, candy-red button when it’s right in front of your face. People don’t wage years-long legal campaigns to remove desktop shortcuts out of a sense of aesthetics. IE improved with time, and consumers weren’t as quick to whip out their credit cards for Netscape. That hurt, but it was the loss of corporate sales that deprived the one-trick pony of its bread and butter.

IT professionals had a harder time arguing in jargon-free terms why management needed to pay for another program when there was something free on every machine they bought. The truth is, IE wasn’t free. Its direct cost was built into Windows and into every PC, where Microsoft’s other squeeze-out tactic was working brilliantly: exclusive contracts with hardware manufacturers gave them an unassailable beachhead in the Browser Wars.

Every Windows user is still paying indirectly for IE: ActiveX was the cluster bomb that unpaved the way for our current security crisis, but it wasn’t the first time Microsoft compromised a product’s integrity for a short-term win. Other IT dinosaurs may remember the stability hit NT 4 took because Microsoft pushed the graphics handling up the stack and allowed it memory-protection-free access to the executive layer. I hate performance shortcuts that diminish a system’s stability or integrity.

The strategy worked; Microsoft won the Browser Wars, and it took the industry almost a decade to climb back out of that hole. Thanks largely to Firefox and Safari, browsing the web is evolving again. I’m personally enthralled by Safari 3’s ability to move tabs around within and between windows. Hmm, I suppose I don’t mind the monopolies that give me the best product, hence my love-hate affair with Apple.

Hearts and Minds, Bait and Switch

Would it surprise you to know that Microsoft promised a full XML-based format for Office documents back in 1997? It surprised me when I attended an XML conference in Seattle. I also got to meet Larry Wall, eat expensive sushi on the IE product manager’s tab, and make a good friend who indoctrinated me into the ways of perl mongery, beer, and the Philly food scene. There was also a second incident with fried shrimp heads, but that’s better left unmentioned in polite company.

The ongoing controversy about ODF is the last act of a decade-long attempt by Microsoft to coerce an open standard to their own ends. They’ve done it with other things, but never before with the quiet persistence of wrestling XML away from an open standard and into another proprietary format for holding people’s documents ransom. Oh, and don’t think Adobe isn’t doing the same right now, but they’re a lower-case evil compared to Microsoft’s all-in-caps EVIL.

Their strategy wasn’t unlike the grassroots Christian Fundamentalist takeover of the US government in the 70s and 80s. Instead of school boards, Microsoft seeded open standards committees with their people. I won’t say these people were active agents of evil; some seemed truly interested in doing the right thing in the free-as-beer sense. Maybe a few of them were sleeper agents waiting to go off, but others were just good people that would eventually get squeezed between their two masters, the open ideal and their leash holder.

Microsoft’s first few attempts to generate XML (and even HTML) from Office documents were, well, horrible. It was a half-dozen years before Office 2003 could generate something better than a well-formed mess with more tagging and attributes than content. I’d already developed a system that converted Word documents into rich SGML for SB, so I know that it’s a difficult but solvable problem. My talents aside, it’s hard to believe that however-many Office programmers working within Word’s source couldn’t accomplish in six years what took me about one. Clearly there was a lack of motivation on somebody’s part to solve the problem well.

The markup and codes in Word are all presentation focused and lack any sense of context beyond range, paragraph, and document. A useful (SG|X)ML document has a structure, expressed as a DTD or XML Schema, and depends heavily on context–especially for output specifications, expressed as CSS, FOSI, XSL/T, etcetera. Supporting this complexity would require a fundamentally different Word. That would have a leveling effect since the first few versions would lack features, have bugs, and have to play catchup to some of the better structured document editors of the day. Before EVIL even enters into it, there’s a self-preservation issue here.

Thing is, Microsoft only appeared to plod forward clumsily toward the promised XML-native office formats. They had a two-pronged attack on the open standard. The first prong was the format bait-and-switch that worked so well in the Browser War: Embrace the standard, then start saturating it with custom features to support functionality only found in Microsoft products. This didn’t surprise anybody. What surprised me was the second prong, a legal maneuver claiming copyrights on the schemas that define the “open format” documents to effectively ban compatibility in competing products like OpenOffice.

With everybody more concerned about web pages and records management lately, I haven’t been paying as much attention over the last few years. A recent Ars Technica article [Analyst Group Slams ODF, Downplays Microsoft ISO Abuses] includes the whole cast in medias res. What I thought was the last act was just the end of the first part of a trilogy, with part two titled “The Two Formats”. I also saw something about Microsoft trying to create a PDF competitor; I might actually pop some popcorn and plop down on the couch to watch that one unfold. It sounds like King Kong Versus Godzilla!

Free Now, Not So Much Later

The common thread here is that what’s free now can cost you later. When the free beer is flowing, make sure you either keep your wits about you or have some good friends watching your back or taking your keys. Otherwise you may wake up in an alley with your pants down around your ankles and a big Windows logo tattooed on your ass.

When Free Equals Profit

Hey, EMC, pay attention! Wired reports Microsoft is going to give free copies of its software development tools to students. Even Microsoft gets how giving away something for free now can equal profit later: Flooding the market with graduates already trained on their products will encourage businesses to choose those same products. I’ve been shouting and begging for a developer license program from Documentum for ages under a similar argument.

I want Documentum to provide an affordable, if not free, single-user developer license for their products. I mentioned before how big Documentum is and how my clients shape my knowledge of Documentum. No one project and no company I’ve seen so far uses the entire suite–nobody probably would because some of the products are industry-focused. I have to choose carefully when accepting contracts since that’s my only legal access to Documentum software. Each contract shapes my marketability for the next, so I sometimes pass up otherwise good, interesting work when it’s dead-end or same-old same-old from the Documentum perspective.

The parallel between what I want and Microsoft’s deal for students has to do with the affordability gap. The average college student can’t otherwise afford licenses for Studio, SQL Server, and the other half-dozen tools. Microsoft doesn’t need to give independent developers a free ride because they have a developer program that gives all that and more for a price the average professional can afford–and write off on their taxes. Many of my Softie peers do just that, but I can’t do the same thing with Documentum. The regular cost of the product is geared to large companies.

If I want to learn about a particular aspect (hint, hint) of the product, my choices are limited: (1) Pirate the software, something I oppose strongly from my days at Commodore. (2) Use a client’s software for my own purposes on my own time, but my contracts usually don’t allow me to profit from such work or even publish it on my blog. (3) Influence clients to use the technologies that interest me, an ethical minefield as well as a risk to a project’s success. None of these are acceptable, so I (4) try to pick projects that give me what I want. However, clients don’t want to hear, “I really want to work with technology x, and you should hire me even though I have no real experience with it.” It’s a huge marketing downside that I personally can offset with other skills and seniority. Most developers can’t.

That’s how Documentum (the company) retarded the growth of its developer community: by keeping its cards too close to its chest. Things have actually gotten better under EMC’s control; the EMC Content Management Developer Center is a good supplemental resource that gets updated frequently and tries to foster a sense of community. However, it’s no substitute for having the product in my own two hands. Hopefully one of those Microsoft fans at EMC will take note of their pop idol’s action and emulate it. Boy, isn’t that a strange thing for me to wish for?