
Organizational Design from the Industrial Age to the Digital Age

As the Information Age continues to evolve at a rapid rate, businesses are finding it challenging to compete in their marketplaces.  New technologies, global markets, demanding customers, disengaged employees and aggressive competitors are placing unprecedented pressure on many companies, which must now rethink the way they do business.  To be successful in the current economy, businesses must ensure that the collective efforts of their workforce are aligned with their strategies, goals and principles.  More importantly, companies must transform themselves into Digital Firms (Industrial Economy → Digital Economy), where business processes, information, technology and relationships with customers, suppliers and employees are digitally enabled, and key corporate assets are managed through digital means.

5 Fundamental Elements for Organization Design

  • Empower the capabilities of your people.
  • Enhance collaboration, communication and management across the workforce.
  • Develop efficient digital business systems and programs.
  • Design an effective, agile and sustainable organizational structure.
  • Establish a supportive work culture.

Organization Design transforms complex enterprise ecosystems into flexible, agile and sustainable organizations that can respond and pivot in the face of external and internal developments, and achieve success.

Should Business Strategy be Influenced by Technological Considerations?

Can business strategy be created in isolation from technology considerations? There is a widespread belief in the business community that business strategy comes first, and that technology then follows in some way to support the business.

In my experience, the common perception among organizations is that the business defines its strategy first and technology then enables that strategy.

Strategy Development Process:

To explore the role technology plays in shaping and supporting the business, let’s look at how strategies are developed.  A significant amount of research has been done and published on how strategies are developed.  Here are some relevant highlights.

There are two main dimensions to strategy development.

  1. Visionary thinking, based on intuition, a sense of the market, and an ability to make bold predictions and define goals.
  2. Analytical thinking, in which strategy development is largely based on scientific analysis, the consideration of options and recommendations based on that analysis, followed by implementation.
    • Strategic Analysis, guided by a scientific approach: understanding your markets, competitors, value chain, and the bargaining power of the key stakeholders.  It also entails understanding the strengths and weaknesses of your organization against the opportunities and threats that the external environment presents.
    • Strategy Formulation, guided by the analytical findings and alignment to the vision and overall goals of the organization, to create a strategic road-map.
    • Strategy Implementation: converting the strategy into real results by successfully executing it.

Strategy development is the focus of this article, specifically strategic analysis, which then guides strategy formulation and implementation.

Is there a place for technological considerations in strategic analysis? The examples that follow make the answer quite apparent.

Technological Influences on the Business Landscape

Examples of technologies that have had a transformative impact on the business value chain, and have redefined markets and distribution channels, are all around us.

The globalization phenomenon enabled by the Internet is one of the most profound. The Internet has impacted all the traditional dimensions of business strategy (reduced barriers to entry, increased market size across the globe without the limitations of geographic divides, increased competition, etc.).

The financial services industry is a prime example of an industry where technology has transformed the value chain, redefined competitive forces and given consumers a tremendous amount of bargaining power.  Entry barriers have been declining and new competitors have emerged. Some financial products and services have become more transparent and commoditized, making the market more competitive. The Internet, as a tool to create a new service delivery channel (reduced channel costs, 24x7 availability), has put pressure on the more traditional branch-based channels, and the resulting service delivery cost structure has changed. ING operates on the model that bricks and mortar are not required to sell its banking products and services.

The healthcare value chain has been transformed by technological advances: linking healthcare records through electronic information exchange and moving diagnostic imaging from traditional film to digital have redefined the value chain and changed the balance of power between suppliers and buyers, not to mention the very nature of the products and services being delivered.

The retail industry is another example where technology has changed the business landscape.  Amazon’s strategic business model was completely defined by technology.

Relationship between Business and Technology

Given how profoundly technology has influenced our business and personal lives, it is hard to fathom how a successful business strategy can be defined without considering technological influences and enablers.  By creating a partnership between Business and Technology at the strategy development stage, you are creating a strategy that is well-formed and can maximize business value and competitive positioning by embedding technological considerations from the very start (and not as an afterthought!).

So why is there such a significant divide between Business and Technology?  In subsequent articles, I will focus on why this barrier (real or perceived) exists.

If you have examples to demonstrate the benefits of business/technology partnerships, please share your thoughts on this forum.

The Implicit Value of Content is Realized Through Business Process

As I have noted before, much of the historic discussion in the document management field has concerned the cost of producing content, or the cost of finding existing content. But the value of a document, or any other piece of content, is seldom the same as its cost of production.

I was chatting about this the other day with my colleague James Latham. He used an invoice as an example of a piece of content that may be managed by an enterprise content management (ECM) system. James noted that, ‘There is inherent or explicit value in an invoice’. In fact the value of an invoice is fairly tightly linked to the cash it represents. A $10 bill has an explicit value of $10. Likewise a delivered invoice for $10 has a value of about $10 to an organization. Arguably it is not quite as valuable as $10 cash given the delay and perhaps uncertainty of payment, but it is close enough in most cases and will be treated as such in an accounting system.

There is a case where a $10 bill is worth much more: if it is a rare, old $10 bill, it may have a lot of implicit value (e.g. to collectors it may be worth hundreds of dollars) above its explicit value of $10. Tangible value (explicit plus implicit) is established by sale of the item itself or by recent valuations of comparable items. But it is hard to think of invoices, especially electronic invoices (i.e. digital content), as having any implicit value.

Are there other kinds of enterprise content besides invoices that clearly have implicit value? I think so. Here’s a good example: documents that support a patent application for a product with large market potential may have huge implicit value that greatly exceeds their cost of production and their explicit value at a given moment. This implicit value may become more explicit over time with the issue of a patent, together with product and market advances. At some point an intellectual property sale could attribute very significant tangible value to the documentation.

In this patent documentation example, the application of process over time helps to create tangible value. In ECM discussions we often speak of the context of content as helping to give it meaning, but clearly we also need to consider how process can give it value.

Enterprise Content Architecture – my take on the Metastorm acquisition

I’m particularly excited by today’s announced acquisition of Metastorm by OpenText, but not perhaps for the same reasons as many others.

What excites me is the potential of Metastorm’s strengths in Enterprise Architecture (EA) and Business Process Analysis (BPA). As noted in the release: “Metastorm is a leader in both BPA and EA as recognized by Gartner in the Gartner Magic Quadrant for Business Process Analysis Tools, published February 22, 2010 and the Gartner Magic Quadrant for Enterprise Architecture Tools, published October 28, 2010.” These capabilities play to both the ‘Enterprise’ and the ‘Content’ in Enterprise Content Management (ECM).

Organizations depend on a growing proportion of knowledge workers, as I discussed in a previous post (Value for Knowledge Workers), but as noted in the McKinsey study I covered (Boosting the productivity of knowledge workers), most organizations do not understand how to boost the productivity of knowledge workers, or indeed the barriers to that productivity. As I noted: “What struck me in reading the article is that while an increasing proportion of staff in companies are knowledge workers, it is not clear what knowledge work is and how to best enable it to drive productivity gains. Given that, it is hardly surprising that people struggle to define the value of those software tools best able to support knowledge management.”

Content is the currency of knowledge work. It supports the exchange of knowledge during business processes, and is very often the work product of such processes (e.g. a market analysis report, an engineering drawing or a website page). Too often in the past, discussion of the value of content has centered either on reducing the unit cost of producing, finding or using content, or on mitigating compliance risks created by poor content management.

This is not a new theme for me; indeed, last August I expressed my enthusiasm for why Content Matters. I noted: “It’s no surprise to people that you can understand a business by ‘following the money’ or ‘following the customer’ and that is the basis for ERP and CRM systems. On the other hand most people are only just coming to realize that ‘following the content’ is just as important, so while we’ve talked about content management for many years, that conversation is starting to be important to business.”

The potential to apply Metastorm’s ProVision tool set to elucidate and illustrate the critical role of content in the achievement of enterprise goals is an exciting one which offers new value to our customers.

Content Management Systems as Cities – I feel like a Mayor!

I recently realized that large enterprise content management (ECM) systems are like a city, but most ECM practices treat them as if they were a building. There’s a big difference in complexity that impacts the operation of an ECM system.

Architects can design a building to suit its intended purpose and building management can maintain it. In the same manner an ECM expert can design a system to manage digital content in support of particular business processes. Much of the ECM literature talks of the benefits of clear system architecture and good governance.

As an ECM system is deployed across an organization, the breadth and number of applications grows rapidly – often into the hundreds – with many different business sponsors and champions! It becomes increasingly hard for any one person to understand all of the different ways that a system is being used, and to exert any effective control. The flexibility accorded users through collaborative, social tools further increases the heterogeneity of an ECM system.

Not all ECM application deployments meet with equal success or longevity. In many ways the applications in an ECM system resemble buildings in a city – different sizes, different ages, different investments and different degrees of success. Some buildings are abandoned and some never get off the drawing board!

No one designs cities – they are just too complex. Sure, there are examples of attempts to do this – the initial design of Brasilia or the redesign of the center of Paris by Haussmann – but over time the efforts and activities of many other people determine how a city develops. In fact cities are very much an expression of human behaviour, culture and society.

Overall city management falls to the Mayor and City Council, and their most important tools are building regulations and permits, ordinances, etc. While you can’t and shouldn’t control everything in a city, you can nevertheless provide some direction and minimal standards. The architects of the many buildings need to get approval for their plans before a building is constructed, and the building operators need to comply with other standards.

When ECM was a new concept, the focus was on how best to design and operate a first application for the new system – a new ‘building’ standing in a ‘green field’ if you will. As ECM matures we need to think about how to operate large, multi-application systems. For me a better role analogy for the person with overall system responsibility is Mayor, not Architect. It’s not that we don’t need ECM Architects – in fact we need many of them – but we also need a Mayor and Council to provide a framework for oversight and long-term strategy. And we have to accept at least a degree of disorder that results from the activities of many different people who are only loosely coordinated – Mayors are necessarily politicians, unlike Architects!

Value for Knowledge Workers

Demonstrable value goes a long way to supporting the deployment of new software tools.

For structured business processes, return on investment (ROI) is comparatively easy to estimate. Where unstructured or semi-structured digital content items (e.g. documents, spreadsheets, faxes, etc.) enable a given structured process (e.g. accounts receivable), their contribution to the overall value created is also typically quantifiable.
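As an illustration, a minimal first-year ROI estimate for a hypothetical structured-process improvement (say, automated invoice capture in accounts receivable) might look like the following sketch. All figures and names are invented for the example, not taken from any real project:

```python
# Toy first-year ROI estimate for a hypothetical invoice-capture project.
# All figures below are illustrative assumptions, not real benchmarks.

def simple_roi(annual_benefit: float, investment: float) -> float:
    """First-year ROI as a fraction: (benefit - investment) / investment."""
    return (annual_benefit - investment) / investment

invoices_per_year = 120_000
minutes_saved_per_invoice = 4          # assumed time saved by automation
hourly_rate = 30.0                     # assumed loaded labour cost

hours_saved = invoices_per_year * minutes_saved_per_invoice / 60
annual_benefit = hours_saved * hourly_rate   # 8,000 hours * $30 = $240,000
investment = 150_000                         # assumed project cost

print(f"First-year ROI: {simple_roi(annual_benefit, investment):.0%}")
```

With these assumed numbers the calculation prints a 60% first-year ROI; the point is only that every input for a structured process is directly measurable, which is exactly what unstructured knowledge work lacks.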

Where the process itself is unstructured, the measurement of value is much harder. Perhaps the largest class of unstructured processes in a company fall in the category of knowledge work. The difficulties organizations have in understanding knowledge work are highlighted in an article just published in the McKinsey Quarterly entitled “Boosting the productivity of knowledge workers”.

  • Aside: Unfortunately a subscription is required to read the full article – hopefully you have one.

The article starts with the proposition that few senior executives can answer the question: “Are you doing all that you can to enhance the productivity of your knowledge workers?” This is unfortunate because, “Organizations around the world struggle to crack the code for improving the effectiveness of managers, salespeople, scientists, and others whose jobs consist primarily of interactions—with other employees, customers, and suppliers—and complex decision making based on knowledge and judgment.”

The authors, Eric Matson and Laurence Prusak, describe five common barriers that hinder knowledge workers in more than half of the interactions in surveyed companies:

  1. Physical
  2. Technical
  3. Social or Cultural
  4. Contextual, and
  5. Temporal

Physical barriers include geographic and time zone separation between workers, and are typically linked to Technical challenges, where workers lack the necessary tools to overcome the physical barriers that separate them. As the article notes, there are many software tools available that can help; these would include the various collaborative and social media tools, as well as the more classic document management applications that are encompassed in the broadest definitions of Enterprise Content Management (ECM).

Of course the availability of software tools does not guarantee that users will use them effectively; indeed, Social (e.g. organizational restrictions, opposing incentives and motivations) and Contextual barriers (e.g. not knowing who to consult or to trust) play a large part in hindering adoption.

The fifth barrier is Temporal. Time, or rather the perceived lack of it, is also a critical factor. In my experience knowledge workers do not consider time spent using social media and collaborative tools as important as other activities. Under time pressure they will stop using these tools if they need to spend more time on other activities they perceive as “real work”.

What struck me in reading the article is that while an increasing proportion of staff in companies are knowledge workers, it is not clear what knowledge work is, nor how best to enable it to drive productivity gains. Given that, it is hardly surprising that people struggle to define the value of those software tools best able to support knowledge management.

Job One in the upgrade of a major ECM system

“Upgrade it and they will run away!” is a risk scenario with any major upgrade of a business-critical, enterprise system, including an enterprise content management (ECM) system.

Often the people promoting an upgrade are technologists, who are almost always ‘early adopters’. Many staff, however, just want to get their job done and will often be confused by, resent or even resist changes. Telling typical users that they will get a whole bunch of ‘cool, new features’ isn’t likely to make them enthusiasts.

Here’s a typical persona of such a user:

  • Doesn’t read corporate communications (newsletters, emails, etc.)
  • Doesn’t like technology
  • Couldn’t care less about the product or site provided it ‘works’
  • Just wants to ‘do their job’ without external disruption

One of the big challenges is to ensure that when such a persona comes to work on the Monday after a major upgrade that they don’t say, “What the *#% happened to the site,” especially when the interface has changed.

I’m struggling with these issues in advance of a major ECM system upgrade. The system is called Ollie and has been in production for 15 years. It now has over 5.5 million objects and 4,000 users – 93% of whom use the system every month. It’s actually the main internal Enterprise Library of Open Text, and is pretty much an un-customized version of the product we now sell as Content Server.

  • Content Server version 10 is just about to be released. It is the latest iteration of a product first called Livelink, and provides the underlying shared services of the Open Text ECM Suite.

Without doubt the newer version provides a better, more modern interface that most users will prefer – once they learn what’s different and how to use it. I know most users will prefer it because it has undergone extensive usability testing, but I also know that you can’t please all of the people all of the time, and most people don’t like surprises at work.

So ‘job one’ is to create a short, effective video that overcomes the shock of the unexpected, since no matter how good our communications strategy is, many people will be surprised. The video also has to smooth the way for further change, because while some of the benefits of the new version will be available on Day One, others depend on subsequent work by knowledge managers using new capabilities that become available after the upgrade.

The Myth of Real-time Collaborative Authoring

In the document management field there has been a succession of products designed to support users working on a document at the same time, even if they are in different locations. These products have failed. They have failed because people don’t work on documents together very often.

I wonder where the belief in concurrent creation of documents came from. In the physical world you seldom see people saying, “Come to my office and we’ll write a document together,” so why expect users to want to do it virtually?

Documents may well be created to summarize a brainstorming session or record the minutes of a general meeting, but the designated author usually ‘goes away’ to somewhere quiet to write the first draft.

Even in the review phase, reviewers independently make comments, suggestions and edits at different times. The author then pulls these together to make a revised version. Email is no different, especially since emails of any length are essentially documents.

Sure the stepwise, asynchronous approach to content authoring and review takes place over a longer period, but it actually makes best use of each participant’s time, and is therefore more efficient overall.

I started to think about this again with yesterday’s announcement that Google Wave will not be further developed (http://googleblog.blogspot.com/2010/08/update-on-google-wave.html). As the blog post says, Google Wave was, “…a web app for real time communication and collaboration”. For the purposes of this discussion let’s consider both collaboration and communication independently.

Collaboration in Authoring

A technical tour de force, Wave enabled users to see others changing content as they themselves changed it. Very cool, but actually disconcerting. I wouldn’t have wanted you to watch me author this blog post, for several reasons:

  • I’m easily distracted and need to concentrate to develop some cohesive thoughts
  • While writing I jump around adding sections, changing others, moving text blocks – it would be hard to follow and I’d have to explain what I was doing which would further slow me down and distract me
  • I’m the World’s worst typist

I’m probably no different than most people, at least regarding the first two points. And perhaps more lethal to the concept of concurrent authoring:

  • You’d get bored – it takes far longer to author a document than read it, and you’d probably want to be doing something else while I work, preferring to comment on my finished work

And that’s the crux of the matter – most people are busy, with many demands on their time, and collaborative authoring is just too inefficient.

Communication Delays are Good

While Wave was designed for collaboration, it was also intended for communication (see the quotation above) – essentially email and instant messaging rolled into one. But I think there is a problem there too – most people actually don’t want to use real-time communication!!

Many commentators have remarked on the tendency for young people to use their mobile phones for text messaging far more than as telephones. You’d think it would be easier to engage in a conversation by talking rather than typing, so why is texting preferred?

I think people prefer texting because it allows them to be engaged in many, independent conversations with different people. For this to work they need to be able to send and receive messages in real time, but also need an agreed expectation that replies may take several minutes. Awkward silences of several minutes on a phone aren’t agreeable, and since voice isn’t cached locally like a text message you have to listen to each voice channel concurrently – which isn’t practical.

Interestingly while they are short, both mobile text messages and instant messages (IM) are generally only sent when they are completed. It is usually enough to see that the recipient is typing (i.e. with Instant Messaging) or to just assume that they are (i.e. with texting). All stumbles, pauses, and corrections are not sent – but they were with Google Wave.

Summary

With small pieces of content: true real-time communication is often undesirable, with near real-time being better.

With larger pieces of content: collaborative authoring is best done asynchronously.

Collaborative authoring seems to be something that IT professionals believe will lead to greater efficiencies, while end users don’t have the time for it!

Taking the Pulse of your Business Content with microblogging

When many users first encounter microblogging they don’t ‘get it’. Twitter is of course the classic and most widely known microblogging site, and its style has been taken up by many others, such as Facebook, in a broader set of social media approaches. A common initial reaction is something to the effect of: “I don’t care if your cat just threw up – in fact, I’d rather NOT know!!”

Once people start to microblog, they find many different ways that it can provide value, beyond answering the question “What’s happening?” that Twitter poses. Commentators have described endless ways of using Twitter: 5 marketing approaches, 10 diverse applications, 50 different topics, etc.

But how does microblogging add value within an organization? Most of the discussions about business value have been about better ways to reach outside an organization to customers and partners by breaking down barriers, increasing transparency and the like.

At first blush, making the case for microblogging in the workplace might seem hard. People often comment that they are too busy to engage in ‘chit-chat’ while at work. But over the last couple of years the use cases that have real business value have become clearer. For me there are two general styles of internal business microblogging:

  1. User status updates – close to the twitter model, but with a distinctly different topic set
  2. Content status updates – fairly unique to business and keyed to the fact that many work processes produce and manage content (i.e. documents and other business files as understood in content management)

At Open Text we recently released the Pulse module for Livelink 9.7.1 that adds microblogging capabilities to support both styles (available for free to customers from the Knowledge Center).

Status updates are pretty much what you’d expect – you can make a post about anything, although some of the most useful ones are:

  • “I’m looking for…”
  • “Anyone interested in…”
  • “Have we…”

These have value because they help people to be more effective through better networking in an organization. You can select specific users to follow, and you can follow the stream from all users. We have a very similar Pulse capability in Open Text Social Workplace.

BUT, I think the real advance in Livelink/Content Server Pulse is to follow the status of content, irrespective of location, in a range of very powerful and comprehensive ways. Sure, you can post a link to content in Twitter, and many microblogging services allow you to attach documents or other kinds of files to your posts. But the advance here is to have the act of adding or changing a piece of content anywhere in an ECM system create a status post. The feed is reporting a content action by another person. If I’m following Joe and he adds a new sales presentation anywhere, I can see it in the status stream – provided of course I have permission in the repository to see the added content. All of the important support for compliance is maintained.

There are many ways to slice-and-dice: by following specific people or all people, and by following changes in user status, content or both. You can also ‘pulse’ specific content objects, so all changes and all comments about a piece of content are seen in the unique Pulse stream of that object. It’s like a filtered window into the stream looking at just one object, even if the ECM system contains millions of documents. And pulsing is not just limited to files/documents, but is applied to containers like folders and places such as project workspaces and communities. You can imagine the power of an accumulated stream of all content and status activity related to a project!

Livelink has had a notification capability for many years, but it requires users to first identify existing documents and containers that they would like to follow. Pulse adds the human dimension – you can be notified of changes based on the people you follow and what they do with the content.

To be honest, I’m still ‘figuring out’ all of the ramifications and power of Livelink/Content Server Pulse, but I’m very excited!  If you’d like to learn more:

  • Initial description in the May 2010 issue of NewsLink
  • Free Webinar Thursday 3 June 2010
  • Software and documentation in the Knowledge Center
  • And if you are an Open Text Online Communities member you’ll be able to use Pulse very shortly (announcement)
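The follow-the-content idea described in this post can be sketched as a tiny publish/subscribe feed. This is a toy illustration under my own assumptions, not OpenText’s Pulse implementation; every class and method name here is invented, and the `can_see` stub stands in for the repository’s real access control:

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    actor: str      # who acted on the content
    action: str     # e.g. "added", "changed", "commented"
    object_id: str  # the content object affected

@dataclass
class Feed:
    """A user's Pulse-style stream: follow people and/or 'pulse' objects."""
    user: str
    followed_people: set = field(default_factory=set)
    pulsed_objects: set = field(default_factory=set)
    stream: list = field(default_factory=list)

    def can_see(self, event: Event) -> bool:
        # Stand-in for the repository permission check that keeps the
        # stream compliant: you only see content you may access.
        return True

    def notify(self, event: Event) -> None:
        # An event reaches my stream if I follow the actor OR pulse the object,
        # and only if the permission check passes.
        wanted = (event.actor in self.followed_people
                  or event.object_id in self.pulsed_objects)
        if wanted and self.can_see(event):
            self.stream.append(event)

# Any add/change anywhere in the system publishes an event to every feed.
feeds = [Feed("me", followed_people={"joe"})]
event = Event(actor="joe", action="added", object_id="sales-deck")
for feed in feeds:
    feed.notify(event)
```

The design point the sketch tries to capture is that the feed is driven by content actions, not by a pre-built watch list: following a person automatically covers anything they touch, anywhere in the repository, while the permission check preserves compliance.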

Syndicated at http://conversations.opentext.com/

Predicting Sentiment in Advance

There are now a number of tools that monitor social networks looking at:

  • Sentiment analysis – general sentiment related to organizations and their brands
  • Topic trend analysis – the relative frequency with which topics are mentioned over time

Therefore, if I make a blog post or tweet, my topic and sentiments will be captured by automated systems, analyzed and reported. There are some pretty sophisticated tools being used by marketers, and while some are free, others are quite expensive. However, as a recent blog post noted, “A technology glitch demonstrates how fragile marketing measurement technology really is” (http://trenchwars.wordpress.com/2010/02/28/a-technology-glitch-demonstrates-how-fragile-marketing-measurement-technology-really-is/). That said, let’s assume they’ll get better, or that this Technorati glitch was atypical.

I can also manually get some trend information using Google Trends, for example looking at ‘ECM’ over time: http://trends.google.com/trends?q=%22ecm%22&ctab=0&geo=all&date=all&sort=0. However, the information on ECM is too sparse, and there is too much ‘contamination’ from other expansions of ECM, such as Engine Control Management.

But I have to admit I don’t use these tools. So I need a tool that, like all great advances, caters to laziness – or increased efficiency, as I might prefer to characterize it! I need help. I need push rather than pull technology. It occurs to me that I wouldn’t mind knowing how my proposed post relates to other posts already made. I’d get a report something like:

“Your post on ‘ECM’ would be the 47th post on this topic so far this year. This topic is declining in frequency.”

“Your ‘negative’ post on ‘content system metadata’ would align with 19% negative, 25% neutral and 56% positive posts on this topic.”

Besides putting my proposed post in context, I wouldn’t mind getting a sample of the most relevant posts so that I could potentially revise my post, add links, references, rebuttals, etc.

At some level this would be a form of assisted authoring. It wouldn’t have to be limited to blog posts. I’d like to do the same for content authored in an enterprise context. The reality is that many reports, white papers and similar work products of knowledge workers duplicate things already available, but people generally don’t look. It’s easier to start typing, imagining your work to be original, than to look first to see whether it’s already been done, or whether there is something close that you can build on.
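The kind of pre-posting report imagined above could be approximated with a simple tally over previously analyzed posts. This is a hypothetical sketch: in practice the topic and sentiment labels would come from one of the monitoring tools mentioned earlier, and every name and number here is invented for illustration:

```python
from collections import Counter

# Hypothetical archive of already-analyzed posts: (topic, sentiment) pairs,
# as a monitoring tool might have labelled them.
analyzed_posts = [
    ("ECM", "positive"), ("ECM", "negative"), ("ECM", "neutral"),
    ("content system metadata", "negative"),
]

def preview_report(topic: str, sentiment: str) -> str:
    """Put a draft post in context: its rank on the topic and the sentiment mix."""
    matching = [s for t, s in analyzed_posts if t == topic]
    rank = len(matching) + 1          # this draft would be the Nth post on the topic
    mix = Counter(matching)
    total = max(len(matching), 1)     # avoid division by zero for new topics
    breakdown = ", ".join(f"{mix[s] / total:.0%} {s}"
                          for s in ("negative", "neutral", "positive"))
    return (f"Your '{sentiment}' post on '{topic}' would be post #{rank} "
            f"on this topic; existing mix: {breakdown}.")

print(preview_report("ECM", "negative"))
```

A real assisted-authoring service would of course also return links to the most relevant prior posts, which is where the revision and rebuttal value comes from.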

Syndicated at http://conversations.opentext.com/

Google’s impact on enterprise content management

Without a doubt, Google has had a huge impact on how the enterprise perceives content management (ECM).

The pluses and minuses were highlighted by two blog posts yesterday:

On the plus side, John Mancini of AIIM listed three, “fundamental assumptions about information management that affect the ECM industry,” in his “Googlization of Content” post:

  1. Ease of use. The simple search box has become the central metaphor for how difficult we think it ought to be to find information, regardless of whether we are in the consumer world or behind the firewall. This has changed the expectations of how we expect ECM solutions to work and how difficult they are to learn.
  2. Most everything they do is free…
  3. They have changed how we think about the “cloud.” Google has changed the nature of how we think about applications and how we think about where we store the information created by those applications. Sure, there are all sorts of security and retention and reliability issues to consider…”

On the negative side, Alan Pelz-Sharpe made a post today in CMS Watch titled, “Google – unsuitable for the enterprise”. Alan introduced his piece by saying:

“For years now Google has played fast and loose with information confidentiality and privacy issues. As if further proof were needed, the PR disaster that is Buzz should be enough to firmly conclude that Google is not suitable for enterprise use-cases.” He went on to say, “It is inconceivable that enterprise-focused vendors… would ever contemplate the reckless move that Google undertook in deliberately exposing customers’ private information to all and sundry with Buzz.”

Google is a hugely successful company, and they are extremely profitable. However, they are not a software company. Fundamentally they are an advertising placement company, and everything they do is motivated by maximizing advertising revenue, whether directly or indirectly. 99% of their revenue comes from advertising, which pays for every cool project they do and every service they offer.

While Google services to consumers have no monetary charge, they are not free:

  • You agree to accept the presentation of advertisements when you use Google products and services; most people believe these to be easily ignored despite the evidence of their effectiveness.
  • More importantly, you agree to provide information about your interests, friends, browsing and search habits as payment-in-kind. Most people sort of know this, but don’t think about it. If you ask them whether they are concerned that Google has a record of every search they have ever performed, they start to get uncomfortable. I expect most of us have searched on terms which, taken out of context, would take a lot to ‘explain.’

While most consumers in democracies are currently cavalier about issues of their own privacy, enterprises most certainly are not. Indeed, the need for careful management of intellectual property, agreements, revenue analyses and a host of other enterprise activities captured in content is precisely why they buy ECM systems.

The furor over Buzz points out that Google did things first and foremost to further its own corporate goals, which clash with those of other enterprises.

In contrast, Google’s goals require it to align with user needs, especially for good interfaces. An easy-to-use interface encourages and sustains use. That ought to be obvious to everyone, but when the effects of the interface on usage are easily measurable and directly tied to revenue (as in the case of Google Search), it becomes blatantly and immediately evident. In contrast, the development of an interface for an enterprise software product may take place months or even years before the product is released. Even if detailed usability research is done with test users, and in-depth beta programs are employed, the quality and immediacy of the feedback is less.
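The immediacy of that feedback loop can be made concrete with a toy calculation. The figures and the `click_through_rate` helper below are entirely hypothetical; the point is only that at web scale even a fraction-of-a-percent interface change is measurable almost immediately:

```python
def click_through_rate(impressions, clicks):
    """CTR: fraction of result-page impressions that produced a click."""
    return clicks / impressions if impressions else 0.0

# Hypothetical A/B figures: variant B carries a small interface change.
# At millions of impressions per hour, the lift is visible within hours
# and translates directly into ad revenue.
ctr_a = click_through_rate(1_000_000, 32_000)   # baseline interface
ctr_b = click_through_rate(1_000_000, 33_500)   # modified interface
relative_lift = (ctr_b - ctr_a) / ctr_a
```

An enterprise product shipped on a yearly release cycle has no comparable channel for this kind of continuous, revenue-linked measurement.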

Besides easy interfaces, enterprise content management users expect ‘Google-like’ search, and are disappointed. There are generally two reasons for this:

  • Search results have to be further processed to determine if a user can be presented with each ‘hit’ based on their permissions
    • Typically 70-90% of the total computational time for enterprise search is taken up by permission checking
  • Enterprises don’t invest as much in search infrastructure as they would if rapid delivery of search results were seen as critical

The second point is probably more important than people admit. In my experience significant computational resources are not allocated to Search by IT departments. I suspect that they look at average resource utilization, not peak performance and the time to deliver results to users. To deliver the typical half second or less response that Google considers to be essential, hundreds of servers may be involved. I am not aware of any Enterprise that allocates even the same order of magnitude of resources to content searching, so inevitably users experience dramatically slower response times.
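The permission checking mentioned in the first bullet can be sketched in a few lines. This is a simplified illustration, not any vendor’s actual implementation: `can_read` stands in for the repository’s ACL check, and in a real ECM system each such call may touch the permissions store, which is why this step can consume 70-90% of total search time:

```python
def permission_trim(hits, user, can_read, page_size=10):
    """Return up to page_size hits that the given user may see.

    hits arrive already ranked by relevance; can_read(user, doc) is a
    stand-in for the repository ACL check. Names are illustrative.
    """
    visible = []
    for doc in hits:
        if can_read(user, doc):     # one ACL check per candidate hit
            visible.append(doc)
            if len(visible) == page_size:
                break               # stop as soon as a page is filled
    return visible
```

Note that when a user is denied most documents, the loop must check far more candidates than it returns, so the cost of the ACL check, not the index lookup, dominates response time.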

In summary, the alignment of optimal user experiences with Google’s need to place advertisements has advanced the standards of user interfaces and provided many ‘free’ services, but the clash of Google’s corporate goals with the goals of other corporations has shown that the enterprise content has value that is not likely to be traded.

Syndicated at http://conversations.opentext.com/

Really looking forward to Virtual Content World – other ways to be ‘virtual’

You’ve probably heard about the first Open Text Virtual Content World (www.opentext.com/virtualcw) on Tuesday 19 January 2010. Hopefully you can attend. I’ll certainly be there in a virtual sense. It’s not too late to register, and if you attended Content World 2008 you’ll have received a promotional code for free registration.

For those who can’t attend, there is an even ‘more virtual’ and dare I say free option – watch the many postings on twitter and Facebook.

The twitter hashtag is #otvcw.

The volume of tweets will really pick up on Tuesday if the Content World 2009 experience is any guide, not just from the ‘official’ event twitter account (@OTContentWorld) but of course from other OT staff like me, and most importantly, customers.

It should be a great event. There has certainly been a lot of organizational activity. Colleagues have told me this virtual event has been as much work as an in-person one.

‘See’ you there!

Twitter: @MartinSS

Syndicated at http://conversations.opentext.com/

Customer Community Success Metrics for 2009

Customer communities are all the rage nowadays, but it is not always clear what works and indeed how to measure success. As 2009 draws to a close we have been reviewing how Open Text customer communities have been doing.

Background

For those not familiar with Open Text, we are a vendor of enterprise-class software to manage digital files (called content). The term enterprise indicates that we sell to organizations, not consumers. We have relatively few customer organizations, but they are typically some of the biggest organizations in business and government. We estimate that at least 1 in 3 Internet users visit sites that run our software! The software we use for our own communities is the same as we sell.

The Sites

For historic reasons, we run three primary community sites (requiring membership) in addition to our typical corporate websites. The community sites are:

  • Open Text Knowledge Centre (KC)
    • Primarily for system administrators of the software we sell
  • Open Text Developer Network (OTDN) which is housed on the KC
    • Primarily for developers using Open Text APIs
  • Open Text Online Communities
    • Primarily for business champions and power users

Site Metrics

  1. The Knowledge Centre is by far the oldest community, dating back to 1996! As you’d expect, it has the most members and the most ongoing activity. Every day approximately 4,000 users access the site, and between 150,000 and 200,000 document downloads are performed every month!
  2. OTDN just completed its first full year, during which just over 3,200 unique users participated
  3. Online Communities got started in its present form in 2005. This last year 10,600 members collectively visited 118,000 times over the year

These numbers only measure direct participation. As you might expect, many community members participate through email-mediated discussions.

Convergence

Multiple systems have traditionally meant that there are multiple, disconnected silos of information. As a result, users don’t know where to look and administrators have to duplicate critical content between systems. A better approach is to deploy a single ‘enterprise library’ of digital files (content) which contains all of the files, but just one active copy of each. The three sites above will soon converge to use the same enterprise library, which will also be used by our corporate website that is open to the general public.

One single repository can make user navigation harder unless the most relevant content is presented and organized in a fashion that best meets the needs of each type of user (i.e. persona). Communities of users with similar interests or jobs are one approach to organizing content, but of course there are others, including personalization based on the activities and preferences of specific users.

Measuring 2010 success

These communities will continue to develop, but the latest social networking approaches provide new ways to surface important content. As we deploy more social networking approaches during 2010 we’ll have a solid base of community metrics from 2009 to judge progress. As you might expect, activities on external sites like twitter, YouTube and facebook are becoming increasingly important.

Syndicated at http://conversations.opentext.com/

Some Thoughts on Effective Enterprise Taxonomies for Content

While content management is usually discussed in the context of enterprise needs (i.e. Enterprise Content Management, ECM), in reality most ECM deployments start at the departmental level. This is not bad – on the contrary, departmental deployments typically address specific business needs, sharpening their focus and improving their chances of success. However, over time, as an enterprise deploys ECM technologies in many departments, the benefits of an enterprise strategy to support cross-enterprise deployment become apparent.