Tag: Education

Organizational Design from the Industrial Age to the Digital Age

As the Information Age continues to evolve at a rapid rate, businesses are finding it challenging to compete in their marketplaces.  New technologies, global markets, demanding customers, disengaged employees and aggressive competitors are placing unprecedented pressure on many companies, which must now rethink the way they do business.  To be successful in the current economy, businesses must ensure that the collective efforts of their workforce are aligned with their strategies, goals and principles.  More importantly, companies must transform themselves into Digital Firms (Industrial Economy → Digital Economy), in which business processes, information, technology and relationships with customers, suppliers and employees are digitally enabled, and key corporate assets are managed through digital means.

5 Fundamental Elements for Organization Design

  • Empower the capabilities of your people.
  • Enhance collaboration, communication and management amongst workforce.
  • Develop efficient digital business systems and programs.
  • Design an effective, agile and sustainable organizational structure.
  • Establish a supportive work culture.

Organization Design transforms complex enterprise ecosystems into flexible, agile and sustainable organizations that can respond and pivot to external and internal developments and achieve success.

What’s the difference between Business Continuity (BC) and Disaster Recovery (DR)?

What’s the difference between Business Continuity (BC) and Disaster Recovery (DR)? This is a question I have had to answer multiple times. It is a very good question and the answer is not simple! So, as a good lazy ‘techy’, I tried to find the answer on the web. That way, when I am asked, all I would have to do is send a link.

I have used this approach multiple times for other questions I have received. It is convenient and a great way to avoid re-typing an answer. However, this time, I was not very successful in my quest to find an answer. I searched the web, multiple times, for hours without finding the perfect “pre-written answer” I was looking for. So I decided to stop being lazy and write it myself.

Now, if you are like me, and you’ve been looking for an answer to this question, feel free to use this one.

So, let’s start with a few definitions from the Business Continuity Institute (BCI) Glossary:

Disaster Recovery (DR): “The strategies and plans for recovering and restoring the organization’s technological infrastructure and capabilities after a serious interruption.” Editor’s Note: DR is now normally only used in reference to an organization’s IT and telecommunications recovery.

Business Continuity (BC): “The strategic and tactical capability of the organization to plan for and respond to incidents and business disruptions in order to continue business operations at an acceptable predefined level.”

First, I’d like to say that I have a slightly different view of DR than the BCI. Now, who am I to disagree with what the BCI is saying? Well, bear with me a little longer and you will see how my interpretation of DR might help people better understand the differences between DR and BC. So here’s my definition: DR is the strategies and plans for recovering and restoring the organization’s (scratch “technological”) infrastructures and capabilities after an interruption (regardless of its severity).

Unlike the BCI, I don’t make a distinction between the technological infrastructure and the rest of the infrastructure (the buildings, for example), nor do I differentiate between types of interruption. In my opinion, whether a system is down or a building is burnt or flooded, both should be considered disasters, and therefore both require a disaster recovery plan.

Therefore DR is the action of fixing a failing, degraded or completely damaged infrastructure. For example, suppose the 2nd floor of a building was on fire; the fire is now out, so the initial crisis is over. Now the damage caused by the fire must be dealt with: there is water and smoke damage on the 2nd floor, smoke damage on the 3rd floor and water damage on the 1st floor. The cleanup, replacement of furniture, repair of the building and its structure, painting, plastering, etc. are all part of the disaster recovery plan.

What is Business Continuity then? Business Continuity is how you continue to maintain critical business functions during that crisis. Back to the example: when the fire started, the alarm went off and people were evacuated from the building. Let’s say you had a call center on the 2nd floor, and this just happens to be a critical area of your business. How would you continue to answer calls while people are being evacuated? How would you answer calls while the building is being inspected, repaired or rebuilt? Keeping the business running during this time is what I call Business Continuity.

The same approach can be taken with a system crash or when the performance of a system has degraded to the point that it has impacted business operations. So fixing the system is DR and the action of keeping the business operations running without the system being available is BC.

In conclusion, BC is all about being proactive and sustaining critical business functions whatever it takes, whereas DR is the process of dealing with the aftermath and ensuring the infrastructure (system, building, etc.) is restored to its pre-interruption state.
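Since the two disciplines are so often conflated, the distinction can be sketched in code. The following Python toy is purely illustrative – the incident names and the workaround table are invented, and real BC and DR plans are far richer than two functions – but it captures the sequencing argued above: BC keeps critical functions running during the interruption, while DR restores the damaged infrastructure afterwards.

```python
from dataclasses import dataclass

@dataclass
class Incident:
    """An interruption affecting some piece of infrastructure."""
    affected: str          # e.g. "call-center-floor-2" or "crm-database"
    resolved: bool = False

def business_continuity(incident: Incident, workarounds: dict) -> str:
    """BC: keep critical operations running *during* the incident,
    typically by switching to a predefined alternative."""
    alternative = workarounds.get(incident.affected, "a manual process")
    return f"operations continue via {alternative}"

def disaster_recovery(incident: Incident) -> str:
    """DR: restore the affected infrastructure to its pre-interruption
    state once the initial crisis is contained."""
    incident.resolved = True
    return f"{incident.affected} restored"

# The fire example: the 2nd-floor call centre is out of action.
fire = Incident(affected="call-center-floor-2")
workarounds = {"call-center-floor-2": "rerouting calls to a backup site"}
print(business_continuity(fire, workarounds))  # BC acts first, during the crisis
print(disaster_recovery(fire))                 # DR follows, restoring the floor
```

Note the ordering: `business_continuity` runs while the incident is still unresolved; `disaster_recovery` is what finally flips it back to a normal state.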

Cloud Computing Defined

Welcome to the first installment of what will be an on-going series on Cloud Computing.  Everyone in the industry is talking about it and the media is awash with hype.  I’ll be taking a different approach by trying to bring some clarity and reason to help you make informed decisions in this area.

The place to start is with a definition of the term.  There are a wide variety of sources that attempt to define Cloud Computing, many with subtly different nuances, and often including benefits (e.g. flexibility, agility) that are potential outcomes from going Cloud, but certainly don’t belong as part of its definition.

I prefer the U.S. Department of Commerce National Institute of Standards and Technology (NIST) draft definition, which is well-considered, well-written and carries the weight of a standards body behind it.  Their definition sets out five essential characteristics:

  1. On-demand self-service: This means that a service consumer can add or delete computing resources without needing someone working for the service provider to take some action to enable it.
  2. Broad network access: Note that the NIST definition does not specify that the network is the Internet (as some other definitions do).  This is necessary to allow for private clouds.  NIST goes on to say that cloud services use standard mechanisms to promote use by a wide variety of client devices.  It’s not clear to me that this should be a full-fledged requirement, but it is certainly in keeping with the spirit of the cloud concept.  One could imagine a cloud that uses custom protocols agreed to by a closed group of consumers, but perhaps the word “standard” still applies in that it would be a standard across the consumer group.
  3. Resource pooling: Also known as multi-tenancy, this characteristic requires that the cloud serve multiple consumers, and that the resources be dynamically assigned in response to changes in demand.  The definition goes on to say that there is also a sense of location independence in that the consumer has no control over the location where the computing takes place.  It is important to distinguish “control over” from “knowledge of”.  The consumer may well know which specific data centre the resources are running in, particularly in the case of a private cloud.  There may also be limitations for compliance or security purposes on where the resources can be drawn from.  The important point is that the consumer cannot pick and choose between resources of a particular class; they are assigned interchangeable resources by the provider, wherever they happen to reside, within the limits of the service agreement.
  4. Rapid elasticity: Capabilities need to be rapidly provisioned when demanded.  The definition does not specify how rapidly, but the intent is that it be in a matter of minutes at most.  The service must be able to scale up and back down in response to changes in demand at a rate that allows potentially unpredictably varying demands to be satisfied in real time.  Ideally, the scaling is automatic in response to demand changes, but need not be.  The definition then goes on to say that the resources often appear to be infinite to the consumer and can be provisioned in any quantity at any time.  This is of course not a rigid requirement.  A cloud service could put an upper bound on the resources a particular consumer could scale to, and all clouds ultimately have a fixed capacity, so this clearly falls in the “grand illusion” category.
  5. Measured service: The NIST definition specifies that cloud systems monitor and automatically optimize utilization of resources.  The definition does not specify the units of measurement, and in fact Amazon and Google’s cloud services meter and charge using very different models (in thumbnail, Amazon in terms of infrastructure resources and Google by page hits).  What is surprising is that the definition does not state that the consumer is charged in proportion to usage, which many definitions consider the most fundamental tenet of cloud computing.  The NIST definition allows a situation, for example, where several consumers (say, members of a trade organization) decide to fund and build a computing facility meeting the five requirements and share its use, but don’t charge back based on usage even though it would be possible.
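As a thought experiment, several of the five characteristics above can be modelled in a few lines of code. The Python sketch below is invented for illustration (the class, its methods and the capacity numbers are not from any real cloud API), but it shows on-demand self-service, pooling with a finite capacity behind the "infinite" illusion, elasticity in both directions, and metering that is deliberately decoupled from billing.

```python
from collections import defaultdict

class MeteredPool:
    """Toy model of a multi-tenant cloud resource pool. Consumers draw
    interchangeable units from a shared pool; usage is metered per
    consumer, but (per NIST) the provider need not bill for it."""

    def __init__(self, capacity: int):
        self.capacity = capacity           # every real cloud is ultimately finite
        self.allocated = defaultdict(int)  # units currently held, per consumer
        self.usage = defaultdict(int)      # cumulative metered unit-intervals

    def provision(self, consumer: str, units: int) -> bool:
        """On-demand self-service: no provider action required."""
        if sum(self.allocated.values()) + units > self.capacity:
            return False                   # the 'infinite pool' illusion breaks here
        self.allocated[consumer] += units
        return True

    def release(self, consumer: str, units: int) -> None:
        """Rapid elasticity: scale back down when demand drops."""
        self.allocated[consumer] -= min(units, self.allocated[consumer])

    def meter(self) -> None:
        """Measured service: record one interval of per-consumer usage."""
        for consumer, units in self.allocated.items():
            self.usage[consumer] += units

pool = MeteredPool(capacity=100)
pool.provision("alice", 60)
pool.provision("bob", 30)
pool.meter()
print(pool.provision("bob", 20))  # False: demand exceeds the real capacity
```

The `usage` ledger is the "measured service"; whether anyone is invoiced from it is, as noted above, outside the definition.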

There’s a lot to like about the NIST definition and it is the one I’ll be using in subsequent articles.  We’ll be digging into what people and organizations are actually doing with cloud computing (without all the hype and hyperbole), and practical considerations for success from both the business and technical viewpoints.

Larry Simon is an IT strategist who advises startups through to Fortune 500 corporations and government institutions on achieving maximum return from their information technology investments through sound governance and practices.

Should Business Strategy be Influenced by Technological Considerations?

Can business strategy be created in isolation from technology considerations? There is a widespread belief in the business community that business strategy comes first and technology then follows in some way to support the business.

In my experience, the common perception among organizations is that the business defines its strategy first and technology then enables that strategy.

Strategy Development Process:

In order to explore the role technology plays in shaping and supporting the business, let’s look at how strategies are developed.  There has been a significant amount of research done and published in understanding how strategies are developed.  Here are some relevant highlights.

There are two main dimensions to strategy development.

  1. Visionary thinking: based on intuition, a sense of direction, and an ability to make bold predictions and define goals.
  2. Analytical thinking: strategy development based on scientific analysis, the consideration of options, and recommendations followed by implementation.
    • Strategic Analysis, guided by a scientific approach: understanding your markets, competitors, value chain and the bargaining power of the key stakeholders.  It also entails understanding the strengths and weaknesses of your organization against the opportunities and threats that the external environment presents.
    • Strategy Formulation, guided by the analytical findings and aligned to the vision and overall goals of the organization, to create a strategic road-map.
    • Strategy Implementation: converting the strategy into real results by successfully executing it.

It is the strategy development that is the focus of this article. Specifically, strategic analysis which then guides the strategy formulation and implementation.

Is there a place for technological considerations in strategic analysis? The answer is quite apparent, as the following examples demonstrate.

Technological Influences on the Business Landscape

Examples of technologies that have had a transformative impact on business value chains and have redefined markets and distribution channels are all around us.

The globalization phenomenon enabled by the Internet is one of the most profound. The Internet has impacted all the traditional dimensions of business strategy (reduced barriers to entry, increased market size across the globe without the limitations of geographic divides, increased competition, etc.).

The financial services industry is a prime example of an industry where technology has transformed the value chain, redefined competitive forces and given consumers a tremendous amount of bargaining power.  Entry barriers have been declining and new competitors have emerged. Some financial products and services have become more transparent and commoditized, making the market more competitive. The Internet as a tool to create a new service delivery channel (reduced channel costs, 24/7 availability) has put pressure on the more traditional branch-based channels, and the resulting service delivery cost structure has changed. ING operates on the model that bricks and mortar are not required to sell its banking products and services.

The healthcare value chain has also been transformed by technological advances. Linking healthcare records through electronic information exchange and moving diagnostic imaging from traditional film to digital have redefined the value chain and changed the balance of power between suppliers and buyers, not to mention the very nature of the products and services being delivered.

The retail industry is another example where technology has changed the business landscape.  Amazon’s strategic business model was completely defined by technology.

Relationship between Business and Technology

Given how profoundly technology has influenced our business and personal lives, it is hard to fathom how a successful business strategy can be defined without considering technological influences and enablers.  By creating a partnership between Business and Technology at the strategy development stage, you create a strategy that is well formed and can maximize business value and competitive positioning by embedding technological considerations from the very start (not as an afterthought!).

So why is there such a significant divide between Business and Technology?  In subsequent articles, I will focus on the barrier (real or perceived) that creates this divide.

If you have examples to demonstrate the benefits of business/technology partnerships, please share your thoughts on this forum.

Content Management Systems as Cities – I feel like a Mayor!

I recently realized that large enterprise content management (ECM) systems are like a city, but most ECM practices treat them as if they were a building. There’s a big difference in complexity that impacts the operation of an ECM system.

Architects can design a building to suit its intended purpose and building management can maintain it. In the same manner an ECM expert can design a system to manage digital content in support of particular business processes. Much of the ECM literature talks of the benefits of clear system architecture and good governance.

As an ECM system is deployed across an organization the breadth and number of applications grows rapidly – often into the hundreds – with many different business sponsors and champions! It becomes increasingly hard for any one person to understand all of the different ways that the system is being used, and to exert any effective control. The flexibility accorded users through collaborative, social tools further increases the heterogeneity of an ECM system.

Not all ECM application deployments meet with equal success or longevity. In many ways the applications in an ECM system resemble buildings in a city – different sizes, different ages, different investments and different degrees of success. Some buildings are abandoned and some never get off the drawing board!

No one designs cities – they are just too complex. Sure, there are examples of attempts to do this – the initial design of Brasilia or the redesign of the center of Paris by Haussmann – but over time the efforts and activities of many other people determine how a city develops. In fact cities are very much an expression of human behaviour, culture and society.

Overall city management falls to the Mayor and City Council, and their most important tools are building regulations and permits, ordinances, etc. While you can’t and shouldn’t control everything in a city, you can nevertheless provide some direction and minimal standards. The architects of the many buildings need to get approval for their plans before a building is constructed, and the building operators need to comply with other standards.

When ECM was a new concept, the focus was on how best to design and operate a first application for the new system – a new ‘building’ standing in a ‘green field’ if you will. As ECM matures we need to think about how to operate large, multi-application systems. For me a better role analogy for the person with overall system responsibility is Mayor, not Architect. It’s not that we don’t need ECM Architects – in fact we need many of them – but we also need a Mayor and Council to provide a framework for oversight and long-term strategy. And we have to accept at least a degree of disorder that results from the activities of many different people who are only loosely coordinated – Mayors are necessarily politicians, unlike Architects!

Job One in the upgrade of a major ECM system

“Upgrade it and they will run away!” is a risk scenario with any major upgrade of a business-critical, enterprise system, including an enterprise content management (ECM) system.

Often the people promoting an upgrade are technologists, who are almost always ‘early adopters’. Many staff, however, just want to get their job done and will often be confused by, resent or even resist changes – telling typical users that they will get a whole bunch of ‘cool, new features’ isn’t likely to make them enthusiasts.

Here’s a typical persona of such a user:

  • Doesn’t read corporate communications (newsletters, emails, etc.)
  • Doesn’t like technology
  • Couldn’t care less about the product or site provided it ‘works’
  • Just wants to ‘do their job’ without external disruption

One of the big challenges is to ensure that when such a user comes to work on the Monday after a major upgrade, they don’t say, “What the *#% happened to the site?” – especially when the interface has changed.

I’m struggling with these issues in advance of a major ECM system upgrade. The system is called Ollie and has been in production for 15 years. It now has over 5.5 million objects and 4,000 users – 93% of whom use the system every month. It’s actually the main internal Enterprise Library of Open Text and is pretty much an un-customized version of the product we sell now called Content Server.

  • Content Server version 10 is just about to be released. It is the latest iteration of a product first called Livelink, and provides the underlying shared services of the Open Text ECM Suite.

Without doubt the newer version provides a better, more modern interface that will be preferred by most users – once they learn what’s different and how to use it. I know most users will prefer it as it has undergone extensive usability testing – but I also know that you can’t please all of the people all of the time and most people don’t like surprises at work.

So ‘job one’ is to create a short, effective video that overcomes the shock of the unexpected, since no matter how good our communications strategy is, many people will be surprised. The video also has to smooth the way for further change, because while some of the benefits of the new version will be available on Day One, others depend on subsequent work by knowledge managers using new capabilities that become available after the upgrade.

Content Matters

I was chatting with a colleague yesterday and he related how he interviews people to join our company. We quickly dropped into role playing – with me as the job candidate. He had a compelling proposition, but as I told him, he was missing the thing that excited me: Content Matters!

As I started to tell him why content matters I found myself getting excited. I realized I’m actually quite passionate about it! Not content itself, but what it enables and how it’s used.

Content matters to companies in a way that changes how they work, how they create value and whether they succeed. It matters whether they recognize that fact or not.

If you want to understand what drives a company look at their value chain – how they create value – and how they are organized to execute each stage in the value chain. Within each stage there are typically many processes, each with many steps. At almost every step there is some content that is created, reviewed, followed or otherwise used; how well this is done makes a difference to effectiveness.

It’s no surprise to people that you can understand a business by ‘following the money’ or ‘following the customer’ and that is the basis for ERP and CRM systems.

On the other hand most people are only just coming to realize that ‘following the content’ is just as important, so while we’ve talked about content management for many years, that conversation is starting to be important to business.

The Myth of Real-time Collaborative Authoring

In the document management field there has been a succession of products designed to support users working on a document at the same time, even if they are in different locations. These products have failed. They have failed because people don’t work on documents together very often.

I wonder where the belief in concurrent creation of documents came from. In the physical world you seldom see people saying, “Come to my office and we’ll write a document together,” so why expect users to want to do it virtually?

Documents may well be created to summarize a brainstorming session or record the minutes of a general meeting, but the designated author usually ‘goes away’ to somewhere quiet to write the first draft.

Even in the review phase, reviewers independently make comments, suggestions and edits at different times. The author then pulls these together to make a revised version. Email is no different, especially since emails of any length are essentially documents.

Sure, the stepwise, asynchronous approach to content authoring and review takes place over a longer period, but it makes the best use of each participant’s time, and is therefore more efficient overall.
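The asynchronous workflow described above can even be caricatured in code. In this hypothetical Python sketch (the reviewer names and comment strings are made up), reviewers contribute independently, at times of their own choosing, and only the author merges the results into the next version – there is no concurrent-editing step anywhere.

```python
from dataclasses import dataclass

@dataclass
class Comment:
    reviewer: str
    text: str

def collect_reviews(draft: str, reviewers: dict) -> list:
    """Each reviewer reads the finished draft independently and
    returns a comment; nobody touches the document itself."""
    return [Comment(name, review(draft)) for name, review in reviewers.items()]

def revise(draft: str, comments: list) -> str:
    """The author alone pulls the comments together into the next version."""
    addressed = ", ".join(f"{c.reviewer}: {c.text}" for c in comments)
    return f"{draft} (rev 2, addressing {addressed})"

# Hypothetical reviewers, each modelled as a function of the draft.
reviewers = {
    "Ana": lambda d: "tighten the intro",
    "Raj": lambda d: "add an example",
}
draft = "First draft"
print(revise(draft, collect_reviews(draft, reviewers)))
```

The point of the sketch is the control flow: review and revision are sequential, serialized through the author, which is exactly the pattern the real-time tools tried (and failed) to replace.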

I started to think about this again with yesterday’s announcement that Google Wave will not be further developed (http://googleblog.blogspot.com/2010/08/update-on-google-wave.html). As the blog post says, Google Wave was, “…a web app for real time communication and collaboration”. For the purposes of this discussion let’s consider both collaboration and communication independently.

Collaboration in Authoring

A technical tour-de-force, Wave enabled users to see others changing content as they themselves changed it. Very cool, but actually disconcerting. I wouldn’t have wanted you to watch me author this blog post, for several reasons:

  • I’m easily distracted and need to concentrate to develop some cohesive thoughts
  • While writing I jump around adding sections, changing others, moving text blocks – it would be hard to follow and I’d have to explain what I was doing which would further slow me down and distract me
  • I’m the world’s worst typist

I’m probably no different than most people, at least regarding the first two points. And perhaps more lethal to the concept of concurrent authoring:

  • You’d get bored – it takes far longer to author a document than read it, and you’d probably want to be doing something else while I work, preferring to comment on my finished work

And that’s the crux of the matter – most people are busy, with many demands on their time, and collaborative authoring is just too inefficient.

Communication Delays are Good

While Wave was designed for collaboration, it was also intended for communication (see the quotation above) – essentially email and instant messaging rolled into one. But I think there is a problem there too: most people actually don’t want to use real-time communication!

Many commentators have remarked on the tendency for young people to use their mobile phones for text messaging far more than as telephones. You’d think it would be easier to engage in a conversation by talk rather than typing, so why is texting preferred?

I think people prefer texting because it allows them to be engaged in many independent conversations with different people. For this to work they need to be able to send and receive messages in real time, but they also need an agreed expectation that replies may take several minutes. Awkward silences of several minutes on a phone aren’t agreeable, and since voice isn’t cached locally like a text message, you would have to listen to each voice channel concurrently – which isn’t practical.

Interestingly, while they are short, both mobile text messages and instant messages (IM) are generally only sent when they are complete. It is usually enough to see that the recipient is typing (with instant messaging) or to just assume that they are (with texting). Stumbles, pauses and corrections are not sent – but they were with Google Wave.
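This send-on-complete behaviour is easy to model. The toy Python channel below (invented for illustration, not based on any real IM protocol) buffers keystrokes and corrections locally and delivers only the finished message, together with a typing indicator – exactly the pieces of real-time feedback that texting and IM expose, and no more.

```python
class SendOnCompleteChannel:
    """Toy model of IM/texting semantics: keystrokes are buffered
    locally and only the finished message reaches the recipient,
    accompanied only by a 'typing' presence hint."""

    def __init__(self):
        self._draft = []      # local buffer; never leaves the sender's device
        self.delivered = []   # what the recipient actually sees
        self.typing = False   # the one bit of real-time feedback exposed

    def keystroke(self, ch: str) -> None:
        self.typing = True
        if ch == "\b":        # corrections are invisible to the recipient
            if self._draft:
                self._draft.pop()
        else:
            self._draft.append(ch)

    def send(self) -> None:
        """Only a completed message is transmitted."""
        self.delivered.append("".join(self._draft))
        self._draft.clear()
        self.typing = False

chan = SendOnCompleteChannel()
for ch in "helo\b\bllo":   # a stumble, two backspaces, then the fix
    chan.keystroke(ch)
chan.send()
print(chan.delivered)      # only the corrected message: ['hello']
```

In the Wave model, by contrast, every call to `keystroke` would have been transmitted – stumbles, backspaces and all – which is precisely what made watching it so disconcerting.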

Summary

With small pieces of content: true real-time communication is often undesirable, with near real-time being better.

With larger pieces of content: collaborative authoring is best done asynchronously.

Collaborative authoring seems to be something that IT professionals believe will lead to greater efficiencies, while end users don’t have the time for it!

Social collaboration for productivity and problem solving

Check out this SlideShare presentation from my colleague Deb Lavoy, covering the Open Text Social Workplace (OTSW) offering. Just this week we put an OTSW system dubbed ‘Hub’ into full production use for Open Text staff (now over 4,000 users).

What’s in a name? Or do you mean what I think you do? Implications for enterprise content culture

Most people love a good rant, especially when it is well-founded, and I’m no different. And so it was that I really enjoyed Laurence Hart’s recent, self-admitted rant (http://wordofpie.com/2010/03/04/a-rant-against-cms/). In his Word-of-Pie blog, Laurence railed against the perceived misuse of the term ‘Content Management Systems’ or CMS. It was topical, well-informed, and most importantly to me, resonated on several fronts.

In brief, Laurence’s position is that:

“…All you Web CMS people need to give the term CMS back! It doesn’t belong to you. A long time ago you took it while the broader content community was trying to futz with the term ECM [Enterprise Content Management]. By the time we realized what was happening, you had taken the term…”

His issue is that while web content management (WCM) is a valid description, it is too often abbreviated to content management (CMS), even though there are a wide range of content types beyond web pages. The common use of CMS is much narrower than is implied. Enterprise Content Management (ECM) was coined in part to describe all content that an enterprise might have.

I’m not interested in the semantic debate about what each term means and which is correct to use. I am interested in what this discussion says about culture and the difficulty of getting people in an enterprise to take a broad view of content. There seem to be at least ‘two solitudes’ in content management – ECM and CMS. It is interesting how specific technology applications shape and restrict expectations.

Last year my employer, Open Text, acquired Vignette, one of the oldest and most established CMS vendors. Most of Open Text’s heritage is from document and records management (Livelink and Hummingbird eDOCS for example), process management (IXOS) and collaboration; in other words ECM. We published a trilogy of books on ECM in 2003-2005. While some staff came from acquisitions prior to Vignette that had expertise in WCM (notably RedDot), they represented a comparatively small portion of the company. The Vignette acquisition brought a much larger group of CMS-oriented staff to Open Text.

I think Open Text is richer for the breadth of perspectives, but we have had to work through the challenge of merging the different cultures. Note I’m not talking about corporate cultures, as indeed the companies were quite similar, but rather the application culture of how best to manage content to meet all the needs of our enterprise customers. Each of us has tended to think mostly of some content types, some approaches to content management, and some business needs.

Take Open Text’s own Intranet as an example. Open Text has been running an Intranet called ‘Ollie’ on Livelink technology since 1996. Fundamentally the Livelink model is one of web folders containing ‘documents’ of any type. This model works really well when the individual and team work products to be shared are typically documents – so it’s great for supporting teams and managing records. However, linked webpages are a much better vehicle for the dissemination of centrally managed content, especially information from an organization to its staff. So last year we broadened our Intranet systems to include a true WCM capability in parallel.

For some in Open Text, the internal use of WCM came none too soon, while for others it was a surprise! I had to make a video to ‘educate’ staff on why we had both approaches and how to choose the best system for their specific needs. It turned out to be easiest to provide context by talking about the parallel evolution of ECM and WCM technologies over the last 15 to 20 years.

The application of social media in an enterprise has also challenged cultural expectations. Those with a WCM background have generally talked about the advantages of working closely with customers through external websites. Most of their value propositions of breaking down barriers and being more transparent are absolute anathema to those ECM practitioners who have focussed on internal process and records management for compliance.

Traditional document management approaches provide another example of cultural expectations nurtured by specific technology experiences. As I mentioned above, Livelink used a web folder paradigm to organize content. It also had rich metadata capabilities, but users tend to think of these as supplementary or optional ways of organizing content. It is fair to say that most users think first and foremost of folders – so it can be a challenge to collect metadata from them. In contrast, with our eDOCS content management system (from Hummingbird) there are no folders – everything is organized through metadata. eDOCS users find browsing folders can be frustrating. Going forward, these alternate approaches are merged in our Open Text Content Server 2010 under our ECM Suite.

Defining effective taxonomies to organize content can be one of the biggest challenges for an enterprise. Generally, people in specific departments, using specific systems, tend to define taxonomies that meet their immediate needs, but the taxonomies they create are generally too limited for wider use. Similarly, other groups create incompatible taxonomies, often to address similar needs. These limitations ultimately contribute to failure. Creating new taxonomies seems to be a recurring theme in many enterprises, as most are never broad enough, scalable or robust.

Ironically then, what a person means by ‘content’, the ‘content taxonomy’ they think is required for their organization, and their perception of the critical features of a ‘content management system’ are all highly subjective!

Really looking forward to Virtual Content World – other ways to be ‘virtual’

You’ve probably heard about the first Open Text Virtual Content World (www.opentext.com/virtualcw) on Tuesday 19 January 2010. Hopefully you can attend – I’ll certainly be there in a virtual sense. It’s not too late to register, and if you attended Content World 2008 you’ll have received a promotional code for free registration.

For those who can’t attend, there is an even ‘more virtual’ and dare I say free option – watch the many postings on twitter and Facebook.

The twitter hashtag is #otvcw.

If the Content World 2009 experience is any guide, the volume of tweets will really pick up on Tuesday – not just from the ‘official’ event twitter account (@OTContentWorld) but of course from other OT staff like me and, most importantly, customers.

It should be a great event. There has certainly been a lot of organizational activity – colleagues have told me this virtual event has been as much work as an in-person one.

‘See’ you there!

Twitter: @MartinSS

Syndicated at http://conversations.opentext.com/

Some Thoughts on Effective Enterprise Taxonomies for Content

While discussions of content management are usually referred to in the context of the needs of enterprises (i.e. Enterprise Content Management, ECM) in reality most ECM deployments start at the departmental level. This is not bad – on the contrary departmental deployments typically address specific business needs thereby sharpening their focus to improve their chances of success. However, over time as an enterprise begins to deploy ECM technologies in many departments, the benefits of an enterprise strategy to support cross-enterprise deployment become apparent.