Sunday, December 9, 2012

What is Information Management Progress in Healthcare?

Progress starts with learning from previous mistakes by instituting solutions and rules which prevent the same mistakes from repeating themselves. Of course, the conditions that lead up to the mistakes can change, so the solutions and rules must be fundamental enough to deal with them. So, where is information taking us? Where specifically in healthcare?

In Healthcare we have medical research data, patient data, records of what the medical field does with the patient, regulations which produce forms for the patient to sign, codes which need to be matched to procedures for billing, analysis which checks the coding accuracy, audits, etc. There are many information systems which interact with each other.

A question to ask a Healthcare IT department is: what can be done to eliminate the integration parts of your information processing? That is, why do so many software solutions need to be integrated through a service layer which is basically acting as a broker between point A and point B? Why are properties of information being “tweaked” during this process? Is there such a thing as information integrity?

Another question to ask is: do friendships among IT folks affect professionalism? That is, when a database outage affects data integrity in applications that rely on it, was the DBA held accountable, or was it deemed unavoidable between friends?

Healthcare information will not progress to better integrity when the IT department at a facility is just at work for the day job. To really reform this information and get back its integrity, everyone has to feel some discomfort; everyone has to change the way they think about their jobs. Will this happen quickly enough to give the best care possible to patients who need accuracy and privacy? We’ll see.

The changing of CIOs every few years at the hospital system will help, as new ideas are sorely needed in Healthcare IT. I don't have enough experience in the Healthcare field to know if there have been a series of "progress traps" in the past (I'm sure there have been), but I am ready to help push progress forward a notch in hopes of giving patients what they deserve, especially for the money that is spent on their care.

Saturday, November 24, 2012

When a DBA holds all the cards

As we move through the life cycle of content management projects, we deal with project managers, internal and external vendors, technical architects, storage managers, and database administrators (DBAs). Sometimes DBAs are reactionary; sometimes they are forward-thinking and actually care about projecting data growth.

You could spend weeks with the software QA department trying to tune the queries that are slow, only to find out that the database is maxed out, or the software is tuned for MS SQL over Oracle, or vice versa.
During major upgrades or migrations, DBAs need to be on top of their game; they need to anticipate jumps in database size, RAM, and CPU utilization. When they fail to do this, performance slows and users start complaining. This puts you in the middle, having to explain to the users and project sponsors that, “yes, you allowed for this possibility”, but, “no, we are still looking into why this happened.”

DBAs can either go into cover-your-ass mode, or fess up to their oversight and make plans to fix it. The telltale sign that you are being stonewalled is when you ask a question and you never quite get a straight answer. Sometimes you get an answer by way of pushback, like, “what queries are you guys running anyways”, or you get long responses that tout their expertise and knowledge, but still don’t answer the questions of performance.
So how do we light a fire under the DBA to boost performance that is obviously being affected by overtaxed RAM and CPU? We could escalate the issue to managers, which could work if the management is accountable and works well with one another. Or we could escalate the issue between the users and the DBA, which would put the two parties most affected by the performance in the same room to describe the pain of waiting for results.
The bottom line is that we need to include all players in a meeting during the planning phases to make sure everyone knows the ramifications and expectations of the project, and to have face-to-face agreements, which are a lot stronger than email acknowledgements.

Sunday, October 28, 2012

Documentum vs. OnBase Part 4: Access Control and Privileges

This is an odd one; it could be a comparison where patents play a role in the underlying design, because in OnBase there is no explicit mention of “access control”. Let’s try to build a simple use case to illustrate the differences between the two approaches to controlling who can do what, and when, to a document.

Use Case: Excluding a reader group from viewing documents in their draft state, then allowing the reader group to view documents in their approved state.


Documentum (lifecycle approach):

  1. Create a dm_document object type with attributes
  2. Create a permissions list each for draft and approved
  3. Create a lifecycle for the document type
  4. Add two states, draft and approved
  5. Add an action to each state to apply the permission set (or ACL)
  6. Apply the lifecycle to the document upon import (manually or automatically)


Documentum (folder security):

  1. Use ACLs inherited from Folder Security and the folder’s permissions
  2. Do steps 1 and 2 above, set permissions on the folders, use folder security, and move documents from a draft folder to an approved folder


Documentum (owner’s ACL):

  1. Use ACLs inherited from the Owner’s permissions
  2. Do steps 1 and 2 above and set permissions manually on the document afterward


Documentum (type ACL):

  1. Use ACLs inherited from the document type’s permissions
  2. Do step 2 above and set permissions manually on the document afterward


OnBase:

  1. Create two document types: draft and approved
  2. Create two User Groups (at a minimum): one with privileges to create, modify, and delete the draft doc type, and one with privileges to view the approved document type
  3. Assign the ability to change doc types to the authors’ group
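Stripped of either product’s machinery, the draft/approved use case above boils down to a state-to-permission-set mapping. Here is a minimal, product-neutral sketch of that idea; the ACL contents and group names are invented for illustration, and this is not Documentum or OnBase API code.

```python
# Permission sets keyed by lifecycle state: group -> allowed actions.
# The groups and actions here are invented examples.
ACLS = {
    "draft":    {"authors": {"read", "write", "version"}},
    "approved": {"authors": {"read"}, "readers": {"read"}},
}

def promote(doc, new_state):
    """Move a document to a lifecycle state and re-apply its ACL."""
    doc["state"] = new_state
    doc["acl"] = ACLS[new_state]
    return doc

def can(doc, group, action):
    return action in doc["acl"].get(group, set())

doc = promote({"name": "policy.docx"}, "draft")
assert not can(doc, "readers", "read")   # readers excluded from drafts
promote(doc, "approved")
assert can(doc, "readers", "read")       # readers may view approved docs
```

Whether the mapping is enforced by a lifecycle action (Documentum) or by group privileges on a doc type (OnBase), the effect the use case asks for is the same.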

Use Case: Make a document revisable


Documentum:

  1. Create an ACL with Read/Write/Version permissions
  2. Apply it to the document


OnBase: change the privileges on a group that is linked to the doc type.

Thus, in OnBase it is routine to change doc types, as keywords are shared across the types, which is taboo in DCTM. However, in DCTM you have three distinct ways to inherit permissions, which is more flexible. The overall difference in permissions is that OnBase is group-privilege based and DCTM is primarily object-permission based. Groups are a lot more configurable in OnBase, which lends itself to easier setup from a User’s perspective. In DCTM, the document object has its own attributes and is linked to ACLs and lifecycles. In OnBase, the document is more of an equal player in the system.

Sunday, October 14, 2012

Documentum vs. OnBase Part 3: Scan/Index workflow

In Parts 1 and 2, we looked at configuration and relations. Now let's look at scanning and indexing using Documentum's tools vs. OnBase's.

Documentum offers a wealth of tools and functionality for scanning, indexing, and validating paper. The huge difference between Documentum and OnBase is that the OnBase user interfaces are integrated into the scan/index modules. You don't have to index in one UI, then switch to another for further processing like you do with Captiva products. EMC bought Captiva but did not fully integrate it.

The concept of interoperable modules was probably not considered the most important aspect of the acquisition. However, as these software acquisitions play out over time, you can see the long-term negative effects of not integrating the products. I believe the days of buying software only to gain market share are gone. It may be profitable in the short run, but these purchases need to be well integrated in the long run to be viable.

Invoices: Documentum
Let's say you use Input Accel to scan and index invoices. You will have to have an approval workflow and an auditing workflow. Which UI is used for the approval workflow: Input Accel, Webtop, xCP, or D2? Which system tries to match the POs by invoice number, receipts, and total amount, or is this custom? I have seen some of the xCP solutions for this activity, and they are complex, expensive, and custom to the general ledger integrations.

Invoices: OnBase
Scanning and indexing are configured and executed through OnBase's Clients. There's no patchwork of double checking and exporting; there's just one interface called "Import > Scan/Index", that's it. You set up the scanner, give the workstation rights to scan, configure the document type and scanning queues, and you're good to go. Oh, and the workflow can be configured to execute after you commit the batch. It's all in one and takes about a fifth of the time to configure. It scales as well.

Let's face it, EMC treats Documentum as a repository for dumping content. It cares about amassing storage first, then figuring out what to do with it, eventually...

Sunday, October 7, 2012

Documentum vs. OnBase part 2: Relations

In Part 1, I looked at configuration between Documentum and OnBase. Part 2 focuses on a few relations which exist in each software suite. I define relations as any object which is associated with another object.


Folders

DCTM at its core has cabinets and folders for System, Resource, and Temp folders, among others. OnBase does not show any folders. The first thing you realize when you use the client is that it shows doc retrieval functionality for doc type groups, doc types, dates, and keywords. I found myself clicking around for folders and felt disoriented. This boils down to the fundamental navigation difference between the two: DCTM shows folders as a base, whereas OnBase shows search initially. To dig a little deeper, we can see File Cabinets and Folders in OnBase; however, the configuration is a lot different. In DCTM, the model is fixed, based on creating a new folder type which has a supertype (dm_folder). In OnBase, the folder type is part of a hierarchy which inherits properties, but these are not exposed. You just create a parent folder type and a child folder type.

Scan Queues

DCTM bought Captiva and integrated it into Documentum. This integration is poor at best. You use Input Accel to scan and index, then export into DCTM. The whole idea of exporting into DCTM tells you how little they are integrated. In contrast, OnBase, at its core, was built with image scanning integration. This means you configure scan queues with doc types and keywords and start scanning. There is no jumping between separately licensed products, indexing and QAing in one interface and validating again in DCTM. OnBase scan queues are configured with indexing and processing in mind, meaning that it is very straightforward to create a queue, associate a workflow, and assign doc types, plus access control. One huge advantage in OnBase is that a document like an invoice can be invisible to ordinary users, but visible in the scan queue for indexing and QA. Tell me you don’t have to jump through hurdles to achieve this in DCTM…


Lifecycles

In 1999, DCTM was the only content management tool which touted the concept of document lifecycles. Now most do, including OnBase. The evolution of lifecycles in DCTM, however, is interesting in that it had LCs before workflow. DCTM LCs were designed with a lot of doc processing power (actions) in them, and workflows pushed docs through them: a doc in DCTM can have one LC, and it’s a pain to reassign it. In OnBase, lifecycles consist of workflow queues which have tasks and actions associated with them. You can have multiple LCs associated with a document as part of a continuum of potential tasks and actions which could be performed on it.

Workflow Folders

In Process Builder in DCTM, you can configure folders as attachments to workflows. You can link attachments to those folders as they enter the workflow. TaskSpace allows you to view the folder contents and the form associated with the task, and even has workflow queues. It’s just that the time it takes to build the TaskSpace application with action buttons, document settings, folder contents settings, and the Process Builder workflow with forms, tasks, and integrations with TaskSpace is about 10 times as much as it takes in OnBase. OnBase was built with lifecycles and workflow in its core configuration.

Tuesday, September 25, 2012

Documentum vs. OnBase part 1: Intro

Anyone who knows how to read a software feature matrix can compare line by line the details of functionality between two ECM systems. What I’d like to do is home in on aspects of these comparisons which are fantastic or frustrating, and in some cases, mind-boggling.
First, we’ll look at configuration, then we’ll look at using the tools.

Config: Document Types
In both the OnBase and DCTM software packages, doc types run the UI show; that is, they function as the core of metadata, permissions, interoperability, functionality, forms, etc. When you do anything with content, you start with doc types. The underlying database and application designs for each app show themselves in the doc type configuration differences. Besides scaling and performance considerations, I’m guessing some patents mess with the best ways to design certain functionality as well. I will show you some of these later.

In OnBase, there is a doc type group and a doc type, that’s it. You cannot create a hierarchy of types like in DCTM. This has many ramifications, good and bad. It limits the potential for complexity, which plagues some DCTM object type designs. However, you can’t use doc types as an organizing factor for more than two levels of categorization in OnBase, which is annoying in medical records when you have hundreds of doc types and one doc type group called “medical records”. Picture constantly having to scroll through hundreds of doc types for each search, retrieval, scan queue, etc. It’s not productive.

Change Management: A huge plus of OnBase is that you can version a doc type. This allows you to add/delete/change metadata and move on with the modified doc type definition, leaving the previous version with the old content. This is huge when you consider records and auditing. In DCTM, you would have to modify the existing doc type, which would change all the previous content, or create a new doc type and migrate the old content. A real hassle if you've ever done it.

Config: Metadata
OnBase’s label for metadata is “keyword”, which is confusing to most ECM folks, as it recalls the Dublin Core metadata standard’s “keywords”, meaning any term related to the content being described. So, OnBase’s “keyword” really means metadata (attributes, or properties) of the document. In both, we have system-level metadata like file path, size, format, name, etc.

Change Management: OnBase has a cool feature which allows you to set multi-doc keyword changes on a keyword. This allows for changing a metadata value across all docs. Of course, you can do this in DQL. DQL remains a powerful tool for the right person, and a dangerous one for others.
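To make the multi-doc change concrete, here is a small sketch of a bulk metadata update over an in-memory document list. The type and field names are invented, and the DQL shown in the comment is an illustrative, assumed statement, not copied from any repository.

```python
# Bulk metadata change: update one keyword/attribute value across every
# document of a given type. Doc types and field names are invented.
docs = [
    {"doc_type": "invoice", "department": "Radiology"},
    {"doc_type": "invoice", "department": "Radiology"},
    {"doc_type": "referral", "department": "Radiology"},
]

def bulk_update(docs, doc_type, field, new_value):
    """Set field=new_value on all docs of doc_type; return count changed."""
    changed = 0
    for d in docs:
        if d["doc_type"] == doc_type:
            d[field] = new_value
            changed += 1
    return changed

# In Documentum the equivalent would be a single DQL statement roughly
# of this shape (hypothetical type/attribute names):
#   UPDATE invoice_type OBJECTS SET department = 'Imaging'
#   WHERE department = 'Radiology'
assert bulk_update(docs, "invoice", "department", "Imaging") == 2
```

The power and the danger are the same: one statement touches every matching document, so a wrong WHERE clause rewrites the wrong records.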

Part 2 looks at relations in Documentum and OnBase and compares some of them.

Monday, August 27, 2012

Alienated from Users

For the past few years I had been working on design and upgrade projects which were well documented from a requirements and functional perspective. My tasks were to work out the technical aspects of the solution. I read the requirements and specs and got about 70 to 80% of the context of the project. In this case, the project was to upgrade an integrated pharma-based DCTM system from 5.3 to 6.5sp3.

This alone was no small task. I was heads-down in WDK weeds for weeks. After hashing out all of the version differences in the customizations and performing the upgrade, I realized that I really had no idea what the content was, or what lab techs did with it, or marketing, or really anyone. I just helped with the technical aspects and went on my way. However, a bad aftertaste after the project made me think about what motivates me to help big pharma and big finance. Why am I facilitating these large greedy companies and their investors to become more efficient at making more money?

This guilt shows in how DCTM labors along at these large installations. There is a malaise in the program management of EMC projects. The clients are going along with it, but in more and more cases bailing to SharePoint, because the client can have more control and pay less money (at least in the near term). Who are these clients, these Users? What problems are they trying to solve? Why do they put up with shitty UIs? Adolescent configuration tools? Complex integration methods?

How did pharma get so caught up in complex 21 CFR procedures when most of them self-police any possible audits anyways? IT in these places is a house of cards, but fortunately no one is a whistleblower, yet. All this left me alienated from DCTM, its big installs, and especially its sequestered Users.

I am now working with Hyland OnBase in the healthcare industry, which is a breath of fresh air. Come back for comparisons between OnBase and Documentum.

Tuesday, May 1, 2012

ECM: Too Big to Fail (or Succeed)

Why did Easter Island Implode?

So what do you do when your client says IT wants to cut infrastructure costs by consolidating all of their separate repositories into a monolithic one? On paper, using EMC’s sizing guide and empirical data (number of objects and size of content) the consolidation may make sense. The following is a partial list of areas that describe the dimensions of this decision:
  • Performance and Scale Issues
  • Political Fallout
  • “Shared” Services

Performance and Scale Issues

The average size and number of content files is the foundation of this analysis, plus the amount of processing and messaging between users. The EMC Sizing Guide helps with the initial logical infrastructure architecture. The guide’s recommendations can then be applied to projections of future content and user growth. Content indexing, search, and security are essential factors in the future of the user’s experience. If security, for example, is complex, then search results may have a tough time processing through multiple layers of identity and access control. If content files are larger than anticipated, then reindexing for upgrades may take a lot longer. For backup/restore, failover, and disaster recovery, complexity is increased for the “routine” patching and upgrades.
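The starting point of that analysis can be sketched as a back-of-the-envelope projection: repository size from object count, average file size, and an assumed annual growth rate. All numbers below are illustrative, not taken from any sizing guide.

```python
# Rough repository sizing: project total content size forward under a
# compounding annual growth assumption. Inputs are invented examples.
def projected_size_gb(num_objects, avg_file_mb, annual_growth, years):
    size_mb = num_objects * avg_file_mb
    for _ in range(years):
        size_mb *= (1 + annual_growth)
    return size_mb / 1024

# 2 million objects at 0.5 MB each, growing 20% a year for 3 years
size = projected_size_gb(2_000_000, 0.5, 0.20, 3)
assert 1600 < size < 1700   # roughly 1.7 TB of content
```

The point of the exercise is less the final number than seeing how quickly compounding growth pushes a "consolidated" repository past the hardware it was sized for.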

Political Fallout

For medium to large content repository implementations there are considerable risks of internal turf wars. For example, IT’s top ten projects set the priorities of IT’s services; if your business unit is not in the top ten, good luck getting the resources you need to complete your projects. So if the consolidation happens and one pool of money is paying for the entire ECM solution, then it becomes a political tug of war to get your work done. Downtime windows for even small fixes become bogged down with “red tape” from a group that has been burned before and is consumed by risk avoidance. Security can be an issue too. For example, some content for HR is extremely sensitive, which mandates that HTTPS is used instead of HTTP. This slows down content transfer and overall performance, which in turn requires more hardware to speed things up.

“Shared” Services

The concept of sharing is that everyone has an equal portion of resources to get their work done. This model works for IT because they need excuses to fall back on when projects are delayed. The portfolio model (like ITIL) fits neatly into shared services in that work is identified and prioritized; however, the small projects get constantly pushed back, and they add up. The business becomes frustrated but has no leverage to fix the situation. The success of IT should be measured not by the execution of its big projects, but by the attention it gives to the many smaller projects that usually languish and never see the light of day.

Saturday, April 21, 2012

The ECM App And the Turtle

Four years ago, I was a member of a consulting team that was trying to pitch its services to a large Pharma in New Jersey. It was for a clinical trial Institutional Review Board (IRB) solution. We had designed and developed a Webtop application based on Compliance Manager (TaskSpace was still too buggy at the time to show). Our audience was a mixture of IT managers, architects, and one lower-level business representative responsible for clinical trial coordination.

We were not prepared. We did not understand the requirements for an IRB. The first thing our audience brought up was that they did not want to sit through any more basic Webtop or Compliance Manager demos. They had just seen an IBM demo which focused on the IRB functionality and proposed solution. All my work was out the window.

That was four years ago. Even now, EMC Documentum does not have a core product to solve the IRB communication and content management issues. Why is this? I know consultants have created one-offs for this, but don’t these “xCelerate” into Documentum’s core offerings? Slide decks only go so far. Dog and pony shows wow the first time. Real solutions which are designed from the users themselves are not getting recognized. They are being co-opted by smart and motivated users who have tech-savvy connections.

ECM applications in general have a base of offerings which gives them a few years of advantage. For example, there’s an open source clinical trial solution which touts the ability to create and manage users and security, and control the UI configuration. This product has been improved over the past four years, but the point is that at its inception, an ECM application could have been built on top of its core products which easily surpassed the open source offering and grabbed market share. Are ECM apps that hard to innovate?

Thursday, April 19, 2012

Do Partners Use DCTM products In-house?

So why is it that most Documentum professionals do not use Documentum products to manage their own content? Wouldn’t you think using the products that they tout as superior for their own use would be a no brainer? Even EMC doesn’t exclusively use its own software for content management! What’s the deal?

If we can understand the reasons why, then we might be able to figure out some core issues with the product suite. Using only DCTM software should be a requirement for becoming an EMC/Documentum partner, but isn’t. What’s wrong with this picture?

Time is money
It takes “non-billable” time to build out the infrastructure, install the content server, application server, and applications, then to configure the applications, users, roles, security, and so on. Management of partner resources is based on greed, not product enhancement. The focus is on extracting value from clients’ basic content management needs, not leapfrogging the product flaws by building great products.

No Rules, no need
If partners and consultants had strict rules and were regulated, they would think twice about not using DCTM products. They would weigh the options of not using DCTM and realize that maybe complying with its inherent rules is not a bad idea.

The carpenter’s house
Like the proverbial carpenter’s leaky roof, consultants are notorious for not taking care of their own back yards in terms of using the tools and knowledge to fix things where they live. They build palaces for clients and come home to trashy trailers.

The configuration and ease of use of SharePoint is a huge and obvious issue. But do consultants create an uproar about this? Not really; most build both solutions, and some have even switched to all SP. This is sad.

Share drive comfort
The share-drive mentality is still strong in most companies. Why switch if the options are not compelling enough? For records retention, just hold onto the content for 8 years from when it was last modified. Done.
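The "8 years from last modified" rule is simple enough to sketch in a few lines, which is exactly why the share drive feels sufficient. The 8-year figure comes from the paragraph above; the year-length approximation is my own assumption.

```python
# The retention rule above: content becomes disposable 8 years after
# its last modification. A year is approximated as 365.25 days, which
# is an assumption made here to sidestep leap-year edge cases.
from datetime import datetime, timedelta

RETENTION_YEARS = 8

def disposal_date(last_modified: datetime) -> datetime:
    return last_modified + timedelta(days=int(RETENTION_YEARS * 365.25))

def is_disposable(last_modified: datetime, today: datetime) -> bool:
    return today >= disposal_date(last_modified)

assert is_disposable(datetime(2004, 1, 1), datetime(2012, 4, 19))
assert not is_disposable(datetime(2010, 1, 1), datetime(2012, 4, 19))
```

Of course, the hard part a real ECM system adds is not the date arithmetic but proving, under audit, that the rule was applied to every document.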

There is hope with the xCelerator movement that Documentum might be trying to focus on a few right things. One is stressing the “product” on top of the product for solving specific vertical business problems. But you can only go so fast in a Maserati when the road is curving and has cracks.

It’s time to consolidate the product talent and build a completely new stack (not 7.0: 1.0 again, this time leapfrogging). This stack should revolutionize the creation and management of content. Make the partnerships with other companies, use the cash reserves, get in the game!

Thursday, April 12, 2012

D2: "Reality Distortion Field"

Having read Steve Jobs's biography and seen how he applied his "reality distortion field" wherever and whenever needed, I started thinking about Documentum's D2 purchase. What is the purpose of this new, "easily configurable" UI? Isn't it just trying to divert attention away from the fact that Documentum has not been able to revolutionize the ECM market the way it could?

The core content server is still there. But, because Webtop was out for so long, most large companies with complex requirements have customized the hell out of it. When xCP came along, the message was "case management", which had most developers scratching their heads. They were trying to figure out how to go from Webtop to TaskSpace. OK, they finally worked that out, but didn't implement it. Then D2 comes out with a wink and a nod saying it's easy to configure and don't worry about your Business Object Frameworks (TBOs and SBOs), they will work. OK, prove it!

Here's where the reality distortion comes in: what EMC needs is a flexible, change-oriented layer for businesses and creatives to build vertically focused products. Yeah, EMC wants partners to certify their solutions and call them products, but I'm talking about a platform that blows away Microsoft's SharePoint, which is just a web version of file sharing at its core anyway.

So, what's it going to be: a distortion field or a completely new product from the ground up? I'm sure this is part of the reason that Newton left Documentum to build Alfresco. EMC has a window of opportunity; will they take it?

Sunday, April 1, 2012

From Form and Process to xCP

Inherent in a form, paper or electronic, are the fundamental components of an xCP application. The following are some of the possible components:


Intent

  • What is the overall intent of the form?
  • What will it ultimately accomplish?
  • Is the form one step in a multistep process?
  • Is the form meant to automate a previously labor-intensive activity?

Fields

Each field has a purpose toward the goals of the form.
  • Who is sending it?
  • What is it for?
  • When is it due?
  • How will you know it has been received?
  • When a field has a format or regular expression, sometimes this will be explicitly stated; other times the recipient (or system) will be responsible for validating the values of the form.
  • Who will verify the values of the form?
  • What will happen if it is rejected?
  • Who sees this form?
  • How is the sign-off validated?
  • Who has the clearance to complete the goals of the form?
  • Are there required signatures?
  • Who signs off on the form?
  • When did they sign it?
  • Is there an assumed process in the form? For example, are there directions on what to do with it?
  • Is there a recipient of the form?
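The format/regular-expression point above can be made concrete with a small validation sketch: each field carries a pattern, and the system (rather than the recipient) checks the submitted values. The field names and patterns here are invented examples, not from any xCP form.

```python
# Per-field format validation: each field has a regular expression the
# submitted value must match. Field names and patterns are invented.
import re

FIELD_FORMATS = {
    "invoice_number": r"INV-\d{6}",
    "due_date":       r"\d{4}-\d{2}-\d{2}",
    "total_amount":   r"\d+\.\d{2}",
}

def validate(form: dict) -> list:
    """Return the names of fields whose values fail their pattern."""
    return [name for name, pattern in FIELD_FORMATS.items()
            if not re.fullmatch(pattern, form.get(name, ""))]

form = {"invoice_number": "INV-004217",
        "due_date": "2012-05-01",
        "total_amount": "149.5"}        # missing a cents digit
assert validate(form) == ["total_amount"]
```

In Forms Builder this kind of per-field validation is configuration rather than code, but the underlying check is the same.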


Process

The tasks of what gets done with a form should be documented and explained in detail.
  • What event initiates the form?
  • Who initiates it?
  • When does the form typically get filled out?
  • How is the form validated?
  • How are the fields on the form used and stored?
  • Who approves the form and what do they do with it?
  • How does the initiator know that the form has been approved?


Exceptions

All processes have exceptions to the general flow of the work. The challenge is to start with enough comprehension of the workflow to automate the most important steps, thus allowing for superior human intervention to occur in between. Chunk the process up enough to help people do their work more easily, not to impede them. Many times I’ve seen large workflows which become the bane of an office worker’s day.


So you have the form that needs to be automated, the inherent process, which is verified by talking with the people who fill it out and who process it, and you know the pieces of xCP:

  • Forms Builder: create the form with fields, security, and validation
  • Process Builder: build each activity of the workflow, map the field data, route the form to the reviewers/approvers, complete the work of the form.
  • TaskSpace: create an app, add the forms and workflows, create roles for the users of the application, build out the tabs and add forms to them, assign tabs to the roles, and define how the application will look to each role.

Plan out the application with each component in mind and how they work together. xCP is nothing new taken separately, but combined it is a powerful tool which in many companies is taken for granted.

Friday, March 23, 2012

Is your ECM Portfolio Management in a Rut?

I recently had the opportunity to meet with a client that had implemented Documentum Webtop 4 years ago. I was part of that implementation. It was like being in a time warp to return after 4 years and see that really no further expansion of the initial rollout had occurred, save for adding on lots more object types. This is an obvious sign that a painful gap exists between what the business wants and what IT delivers. Just tossing in a content management system will not fix issues of communication and lack of persistence.

Like the truck in the photo, Portfolio Management is stuck in a rut. It may not even exist. Here are some of the pain points everyday users take for granted:

  • Not being able to find what they know is in the repository.
  • Wanting to be alerted when a certain document is approved and meets certain key/value criteria: this is a telltale sign of a lack of workflows which would organize and alert groups and individuals as documents are approved.
  • Harboring grievances with DCTM interfaces and tools for years to no avail.
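The alerting pain point above amounts to a simple subscription check: when a document reaches "approved" and matches a group's key/value criteria, that group gets notified. The criteria and group names below are invented for illustration.

```python
# Subscription-style alerting: notify groups whose key/value criteria
# an approved document satisfies. Groups and criteria are invented.
def matches(doc: dict, criteria: dict) -> bool:
    return all(doc.get(k) == v for k, v in criteria.items())

def alerts_for(doc: dict, subscriptions: list) -> list:
    """Return the groups to alert for this document, if it is approved."""
    if doc.get("status") != "approved":
        return []
    return [group for group, criteria in subscriptions
            if matches(doc, criteria)]

subs = [("finance", {"doc_type": "invoice"}),
        ("records", {"doc_type": "policy"})]
doc = {"status": "approved", "doc_type": "invoice"}
assert alerts_for(doc, subs) == ["finance"]
assert alerts_for({"status": "draft", "doc_type": "invoice"}, subs) == []
```

A workflow engine adds delivery, retries, and audit trails on top, but the core of what users are asking for is this small a predicate.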

It is easy to blame the content management system for all the woes of just trying to get your work done, but most of the time the blame lies with the company's lack of process around listening to what the business wants, how they work, when, with whom, etc. Listening comes first, then documenting, budgeting per business unit (not through IT), managing this portfolio, and, at the end, having IT get it done. Many mid-sized companies do not put much emphasis on IT portfolio management, let alone content management. Thus, if issues arise with content management, IT is blamed, not the breakdown of portfolio management.

The need for dedicated IT business analysts
What happens when business units have a phobia of workflow automation? Nothing, unless IT has a strong business analyst who works with the business to figure out the best ways to automate their headache routines, to detail content and approval processes, and to advocate for the business with an eye to the reality of what IT can get done.

Lacking representative governance
Centralized governance for detailing and prioritizing content management projects is key to the success of ECM. The group cannot be run by IT, as they will bulldoze through projects according to the last technology conference they went to. Business goals and requirements must be the drivers of the projects and priorities.

Technical Architects and CYA
If you are lucky enough to have an IT Architect, hopefully he/she can communicate both the business needs and the technical aspects of them. Many companies put IT and architects on a pedestal, which causes conflict if there is a "we and they" environment. A "Cover Your Ass" attitude usually means that a good chunk of productivity is lost to long-winded emails and battles within IT and between the business and IT. If there's CYA going on, then there's portfolio management dysfunction.

A Road Map to Health
An ECM roadmap, which shows high-level goals and detailed ways to get there, is needed to restore trust and communication between what the business wants and what IT delivers. This roadmap should be developed jointly by the governance group and IT, with architects and business analysts in the room. If one person is doing too much talking or pushing people around, gently ask him/her to respect the needs of everyone. Having an outside study done would be a disaster; such studies are usually sponsored by individuals with an agenda that may not be good for governance and portfolio management as a whole.

Wednesday, February 29, 2012

D2 built on ruins

The "new" D2 client from Documentum is similar to how the Native Americans of Chaco Culture built their pueblos on the ruins of previous structures. EMC seems to like to reinvent the UI wheel every few years, ostensibly to make our day-to-day lives a little easier. D2 is their latest attempt to make the UI configurable, flexible, and relevant to today's expectations in client software.

Chaco Culture: Pueblo Bonito

I am not going to dive into the improvements in D2. Let's just say there are many, and it's about time. What I will say is that, like Sharepoint, D2 represents a progressive layer of understanding of how users use software to store, describe, and use content. It's another step up the ladder of awareness, and better technology for getting the tools into the hands of the users who need them.

Will D2 leapfrog Sharepoint as the slick configurator of content management UIs? I doubt it, but it will eventually help with migrations and maintaining consistency with platform upgrades, assuming that it is well tested. This is a big assumption. Speaking of migrations: there's a lot of work to be done moving WDK to D2. All of those customizations out there, all of the tireless hours of converting TBOs...

So the question is whether existing customers leap to D2 or hold onto their WDK foundations until the last supported release is a year old. If the customer is an old-timer (more than 5 years with DCTM), chances are good they will analyze the cost of migrating to D2 vs. another CMS. The leap may be too much to make: it might be cheaper to migrate to open source or another CMS, or DCTM 7.0 might be a must-have. I'm hoping for the latter.

Sunday, February 12, 2012

Upgrade to More Simple and Other ECM Trends

Churn = Change in ECM
In general, IT and business stakeholder churn influences the evolution of ECM systems. As business units become more savvy about what can be done for free in the public domain (e.g., Facebook), they demand the same from proprietary software. For example, Documentum burned through a number of costly search engine deals before settling on open source Lucene. On the flip side, the business has come to prefer Sharepoint for its ease of use.

CIO Revolving Door
In the middle is IT, contracting out development of custom ECM tools and integrations and pushing for automation (usually without a clue as to the intricacies of the business processes), thus burning its bridges within its own company. Then the CIO leaves or is fired. The new CIO "knows" how to deal with the issues (Laughing at the CIO) and changes the information architecture stack under the guise of innovation or an inflection point.

More Simple Configuration
Microsoft Sharepoint started the trend toward simplification by copying everything that was out there at the time and being smart about configuration. OpenText is hopelessly behind in countering this trend, while Documentum bought D2 to leapfrog toward Sharepoint. This direction will lead to more emphasis on change management as companies realize how disorganized their content is. Like Sharepoint sites gone wild, enterprise content has been built business unit by business unit, even where it was centralized. Simple configuration will mean more consistent rules and policies around metadata and taxonomy.

Consistency = Lower Costs
Consistency will help all ECM systems simplify their compliance efforts. Records managers will have to streamline their complexity, lawyers will get better at mandating search and destroy rules in order to avoid potential lawsuits, and IT will take one more step back in terms of its influence over information technology direction.
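To see why consistent metadata matters for those "search and destroy" rules, consider a minimal sketch of a retention check. The retention periods and field names below are assumptions for illustration, not any real policy or ECM API:

```python
# Illustrative-only sketch of a retention ("search and destroy") rule:
# flag content whose retention period has lapsed, driven by consistent
# metadata (doc_type, created date) rather than ad hoc per-unit conventions.
from datetime import date, timedelta

# Assumed retention schedule in years, per document type.
RETENTION_YEARS = {"invoice": 7, "email": 3, "contract": 10}

def is_expired(doc, today):
    """A document is destroy-eligible once its retention period has lapsed."""
    years = RETENTION_YEARS.get(doc["doc_type"])
    if years is None:
        return False  # unknown types are kept, never silently destroyed
    return doc["created"] + timedelta(days=365 * years) < today

docs = [
    {"id": 1, "doc_type": "email", "created": date(2005, 1, 1)},
    {"id": 2, "doc_type": "contract", "created": date(2010, 6, 1)},
]
expired = [d["id"] for d in docs if is_expired(d, date(2012, 2, 12))]
print(expired)  # [1]
```

The point of the sketch is the precondition, not the code: a rule this simple only works if every business unit tags `doc_type` and `created` the same way, which is exactly what simpler, more consistent configuration buys you.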

Leaner IT
IT departments will become shells of what they were. Technical architects will specialize by industry. Outsourcing will turn into product development, not customization. Information Architecture will take on a new, more important role as the role of the IT Director diminishes.

Secure Knowledge for $$$
This is a long shot, but we'll see: when a generic UI for information architecture and content is created--a Facebook for Content--more focus will fall on security and on monetizing a company's content and knowledge. Sharing information for bucks will happen; however, the theft of information, ideas, and software will have to be dealt with. As a publicly owned Facebook is required to grow its investors' money, it will buy content and knowledge from outside sources. As ECM system owners realize that the content they hold, and the analysis they learn from it, could be just as valuable as their core business, they will partner with the likes of Google or Facebook for this new revenue generator.

Saturday, February 4, 2012

DCTM Demos Deep Dive

We all know that demos are a great way to show potential clients some of the more unique features and functionality of the product or service you are trying to sell. The issue is how well you know the audience's concerns. If the audience is a pharma, do you focus on compliance and retention policy, or on converting their paper process to electronic forms? If the audience is finance, do you show them an AP solution or an integration with ERP? The bottom line is the related experience you bring to the demo: the ability to really talk about the pain points and show how the product solves them.

The days of the dog and pony show are over. Today's audiences are made up of business-savvy folks who not only want automation, they want someone who knows what the hell they're talking about. If you don't have an analyst as part of the demo, the demo will fall flat.

Realize that your audience will most likely sit through a few demos from other EMC partners, open source alternatives, competitors, etc.

Here is a list of key areas and questions to focus on when designing and developing a demo (or prototype):

Who is the audience? 
It used to be IT only; now it is a mixture of IT and business, so make sure the demo has plenty of meat on it.

Detail the Requirements
If the demo is for an Accounts Payable scanning solution, make sure to focus on the inconsistencies of invoices, the exceptions that happen, why payments get held up, etc. Do not promise that everything can be done with configuration unless you are absolutely sure: if you make this mistake and get the contract, good luck extending it after blowing deadlines and coming in over budget.
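The exception cases above are exactly what a credible AP demo should surface. A minimal sketch of the routing logic might look like the following; the field names, the 5% tolerance, and the exception labels are all assumptions for illustration:

```python
# Hypothetical sketch of invoice exception routing for an AP demo:
# an invoice goes to straight-through processing only when it matches
# its purchase order; otherwise the exception is classified so the demo
# can show how each case is held and resolved. Names are illustrative.

def classify_invoice(invoice, purchase_orders):
    """Return 'auto-approve' or a labeled exception for one scanned invoice."""
    po = purchase_orders.get(invoice.get("po_number"))
    if po is None:
        return "exception: no matching PO"
    if invoice["amount"] > po["amount"] * 1.05:  # assumed 5% tolerance
        return "exception: amount over tolerance"
    if not invoice.get("vendor_id"):
        return "exception: missing vendor"
    return "auto-approve"

pos = {"PO-100": {"amount": 1000.0}}
print(classify_invoice({"po_number": "PO-100", "amount": 1020.0, "vendor_id": "V1"}, pos))
# auto-approve
print(classify_invoice({"po_number": "PO-999", "amount": 50.0, "vendor_id": "V1"}, pos))
# exception: no matching PO
```

Walking the audience through each exception branch, with their own messy invoices in mind, is worth far more than claiming the happy path covers everything.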

Audience Pushback
Be prepared to defend automation and new ways of looking at old software solutions. Most clients want to design a solution that mimics their existing sneakernet process, including designing electronic forms that look exactly like their paper forms. Be prepared to explain the advantages of splitting up the forms for security or signoff reasons, or for sustainability as the process changes over time.

This is my JOB!
You might be showing a demo to someone who sees it as obsoleting their job. They will pick the demo apart for its shortcomings. They will feel compelled to prove their worth with their knowledge. Take the opportunity to more fully understand their issues and their undermining techniques, and be sure to respond with assurances that this will not take their job. If the demo is a workflow, the "big brother" aspect may enter their minds. The point should be that automation frees up knowledge workers to think and innovate more...

Business folks love reports. They have the knowledge to interpret them and need to Powerpoint them to the higher-ups.

Technical Architecture
Make sure you know how the demo can scale with the number of users, the expected performance, and the levels of security and encryption. Again, if you promise what you don't really understand, it will come back to bite you later, for example if the required encryption level is not fully supported by the OS and software implemented.

Be sure to have someone who can tell battle stories about how this demo's solution succeeds. It helps to drink your own Koolaid, but how many EMC partners use Sharepoint? A lot. Why? Because it is easy to set up. Why aren't all EMC partners required to use only EMC products? Good question: try creating a demo of a break/fix system like Jira and you'll know what I mean. But seriously, if you took the time to use the products that you design solutions for, you'd feel more of the pain that your audience goes through.

Saturday, January 28, 2012

"With Liberty and Justice for Some" applied to ECM

Glenn Greenwald's book, "With Liberty and Justice for Some," can be applied to the world of enterprise information management. In my 20 years I have watched content management projects gain political momentum inside companies, get rammed through the process (usually ignoring the requirements of key groups), and finish with fanfare and pomp, after which IT moves on to the next big project. This is not new; what is crucial is Greenwald's point that we need to resolve the issues of the past in order to move on fairly and responsibly.

No one wants to look back on an ECM project's successes and failures, especially given the fast pace of software technologies. By the time I'm done with a project, the core software has been patched and a major upgrade is approaching fast. But if the teams involved in designing, developing, deploying, and testing the project could sit down and work through the issues retrospectively (because there were no doubt issues; there always are), the whole company would be that much richer in understanding and more agile in its pursuits.

I know, this is called a "post mortem," and all companies do it, or at least try halfheartedly. What I'm really getting at is true analysis of who the bullies were in the process, which groups got shafted, who broke the rules for personal reasons, etc. I am concerned with the characters involved: what they did that was constructive and what they did that obstructed the process. In most companies a "post mortem" is done by the IT team, maybe including a few doers from the business side. Management does not get that involved, unless they want to fire someone or control the situation (I know, pessimistic).

If management understood the underlying negative impact that some projects and software impose on their workers, they would demand closer scrutiny of every project. Instead, management seems more keen on the next big thing, leaving the workers to scrape together, on their own, new ways to deal with the mess they were just handed. I say mess because information projects never really end; they have a "long tail." Some information management projects eventually fail because they were not nurtured and given the emphasis and attention they rightfully deserve. Just as Greenwald says that the elite enjoy impunity under the rule of law, so too companies and managers as a whole need to take their share of the blame, retrospectively, when an information management project fails.

Friday, January 13, 2012

Demos: Not as easy as it sounds

Creating a plain vanilla software demo is easy: follow the tutorial if you haven't used the software before and build a simple solution to show a potential client. That worked years ago. Now demos are supposed to be multi-tiered, catering to new users and experienced users alike.

What is usually missing during demos is a focused solution that solves the business problems that brought the potential client to see what the demo is all about. If they are being pitched an xCP solution, they have most likely seen the same vanilla demo over and over again.

What the client is looking for is your understanding of their problems, that you've been there. Some can see the potential of the demo to solve their problems. Others will want a business analyst to give the details of how the requirements of the business will be matched with the functionality of the software.

The problem is that every business is unique, and each is seeking software to fix issues with its processes, content management, or collaboration. They have gone to the trade shows, they have read the books, they want to move up in their companies. Now you have to somehow convince them that the software can get them their promotion, or at least make them look good.

If you have a fast-talking sales guy in the room, quickly get a feel for whether the clients are comfortable with his energy and fluff. If not, let the demo do the talking. If that doesn't work, bring out the business analyst who has been in the trenches. If they chew him up, walk away.