
What’s the difference between Business Continuity (BC) and Disaster Recovery (DR)?

What’s the difference between Business Continuity (BC) and Disaster Recovery (DR)? This is a question I have had to answer multiple times. It is a very good question and the answer is not simple! So, as a good lazy ‘techy’, I tried to find the answer on the web. That way, when I am asked, all I would have to do is send a link.

I have used this approach multiple times for other questions I have received. It is convenient and a great way to avoid re-typing an answer. However, this time, I was not very successful in my quest to find an answer. I searched the web, multiple times, for hours without finding the perfect “pre-written answer” I was looking for. So I decided to stop being lazy and write it myself.

Now, if you are like me, and you’ve been looking for an answer to this question, feel free to use this one.

So, let’s start with a few definitions from the Business Continuity Institute (BCI) Glossary:

Disaster Recovery (DR): “The strategies and plans for recovering and restoring the organization’s technological infrastructure and capabilities after a serious interruption.” (Editor’s Note: DR is now normally only used in reference to an organization’s IT and telecommunications recovery.)

Business Continuity (BC): “The strategic and tactical capability of the organization to plan for and respond to incidents and business disruptions in order to continue business operations at an acceptable predefined level.”

First, I’d like to say that I have a slightly different view of DR than the BCI. Now, who am I to disagree with what the BCI is saying? Well, bear with me a little longer and you will see how my interpretation of DR might help people better understand the differences between DR and BC. So here’s my definition: DR is the strategies and plans for recovering and restoring the organization’s (scratch “technological”) infrastructures and capabilities after an interruption (regardless of its severity).

Unlike the BCI, I don’t make a distinction between the technological infrastructure and the rest of the infrastructure (the buildings, for example), nor do I differentiate between the types of interruptions. In my opinion, whether a system is down or a building is burnt or flooded, both should be considered disasters and therefore both require a disaster recovery plan.

Therefore, DR is the action of fixing a failing, degraded or completely damaged infrastructure. For example, suppose the 2nd floor of a building was on fire; the fire is now out, so the initial crisis is over. Now the damage caused by the fire must be dealt with: there is water and smoke damage on the 2nd floor, the 3rd floor has damage caused by smoke, and the 1st floor has water damage. The cleanup, replacement of furniture, repair of the building and its structure, painting, plastering, etc. are all part of the disaster recovery plan.

What is Business Continuity then? Business Continuity is how you continue to maintain critical business functions during that crisis. Back to the example: when the fire started, the alarm went off and people were evacuated from the building. Let’s say you had a Call Center on the 2nd floor and this just happens to be a critical area of your business. How would you continue to answer calls while people are being evacuated? How would you answer calls while the building is being inspected, repaired or rebuilt? Keeping the business running during this time is what I call Business Continuity.

The same approach can be taken with a system crash or when the performance of a system has degraded to the point that it has impacted business operations. So fixing the system is DR and the action of keeping the business operations running without the system being available is BC.

In conclusion, BC is all about being proactive and sustaining critical business functions whatever it takes, whereas DR is the process of dealing with the aftermath and ensuring the infrastructure (system, building, etc.) is restored to its pre-interruption state.

Cloud Computing Defined

Welcome to the first installment of what will be an on-going series on Cloud Computing.  Everyone in the industry is talking about it and the media is awash with hype.  I’ll be taking a different approach by trying to bring some clarity and reason to help you make informed decisions in this area.

The place to start is with a definition of the term.  There are a wide variety of sources that attempt to define Cloud Computing, many with subtly different nuances, and often including benefits (e.g. flexibility, agility) that are potential outcomes from going Cloud, but certainly don’t belong as part of its definition.

I prefer the U.S. Department of Commerce National Institute of Standards and Technology (NIST) draft definition, which is well-considered, well-written and carries the weight of a standards body behind it.  Their definition sets out five essential characteristics:

  1. On-demand self-service: This means that a service consumer can add or delete computing resources without needing someone working for the service provider to take some action to enable it.
  2. Broad network access: Note that the NIST definition does not specify that the network is the Internet (as some other definitions do).  This is necessary to allow for private clouds.  NIST goes on to say that cloud services use standard mechanisms to promote use by a wide variety of client devices.  It’s not clear to me that this should be a full-fledged requirement, but it is certainly in keeping with the spirit of the cloud concept.  One could imagine a cloud that uses custom protocols agreed to by a closed group of consumers, but perhaps the word “standard” still applies in that it would be a standard across the consumer group.
  3. Resource pooling: Also known as multi-tenancy, this characteristic requires that the cloud serve multiple consumers, and that the resources be dynamically assigned in response to changes in demand.  The definition goes on to say that there is also a sense of location independence in that the consumer has no control over the location where the computing takes place.  It is important to distinguish “control over” from “knowledge of”.  The consumer may well know which specific data centre the resources are running in, particularly in the case of a private cloud.  There may also be limitations for compliance or security purposes on where the resources can be drawn from.  The important point is that the consumer cannot pick and choose between resources of a particular class; they are assigned interchangeable resources by the provider, wherever they happen to reside, within the limits of the service agreement.
  4. Rapid elasticity: Capabilities need to be rapidly provisioned when demanded.  The definition does not specify how rapidly, but the intent is that it be in a matter of minutes at most.  The service must be able to scale up and back down in response to changes in demand, at a rate that allows potentially unpredictable variations in demand to be satisfied in real time.  Ideally the scaling is automatic in response to demand changes, but it need not be (a minimal scaling sketch follows this list).  The definition then goes on to say that the resources often appear to be infinite to the consumer and can be provisioned in any quantity at any time.  This is of course not a rigid requirement.  A cloud service could put an upper bound on the resources a particular consumer could scale to, and all clouds ultimately have a fixed capacity, so this clearly falls in the “grand illusion” category.
  5. Measured service: The NIST definition specifies that cloud systems monitor and automatically optimize the utilization of resources.  The definition does not specify the units of measurement, and in fact Amazon’s and Google’s cloud services meter and charge using very different models (in brief, Amazon in terms of infrastructure resources and Google by page hits).  What is surprising is that the definition does not state that the consumer is charged in proportion to usage, which many definitions consider the most fundamental tenet of cloud computing.  The NIST definition allows a situation, for example, where several consumers (say, members of a trade organization) decide to fund and build a computing facility meeting the five requirements and share its use, but don’t charge back based on usage even though it would be possible.
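
Here is the minimal scaling sketch promised above. It is purely illustrative: the thresholds, the instance cap and the function itself are my own assumptions rather than part of the NIST definition or any provider’s API. The point is simply that metered utilization is what makes automatic elasticity (and usage-proportional charging) possible.

```python
# A minimal sketch (not any real provider's API) of how measured utilization
# could drive rapid elasticity. Thresholds, caps and names are hypothetical.

def target_instance_count(current_instances, avg_utilization,
                          scale_up_at=0.80, scale_down_at=0.30,
                          max_instances=20):
    """Decide how many instances to run, given the average utilization
    (0.0 - 1.0) reported by the metering service."""
    if avg_utilization > scale_up_at and current_instances < max_instances:
        return current_instances + 1   # scale up, within the agreed cap
    if avg_utilization < scale_down_at and current_instances > 1:
        return current_instances - 1   # scale back down to save cost
    return current_instances           # demand is within bounds


if __name__ == "__main__":
    # Simulated utilization samples standing in for a real metering feed.
    samples = [0.55, 0.85, 0.90, 0.40, 0.20, 0.15]
    instances = 2
    for u in samples:
        instances = target_instance_count(instances, u)
        print(f"utilization={u:.2f} -> running {instances} instance(s)")
```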

There’s a lot to like about the NIST definition and it is the one I’ll be using in subsequent articles.  We’ll be digging into what people and organizations are actually doing with cloud computing (without all the hype and hyperbole), and practical considerations for success from both the business and technical viewpoints.

Larry Simon is an IT strategist who advises organizations ranging from startups to Fortune 500 corporations and government institutions on achieving maximum return from their information technology investments through sound governance and practices.

Should Business Strategy be Influenced by Technological Considerations?

Can business strategy be created in isolation from technology considerations? There is a widespread belief in the Business Community that Business Strategy comes first and that technology then follows in some way to support it.

In my experience the common perception among organizations is that Business defines its strategy first and then technology enables the strategy.

Strategy Development Process:

In order to explore the role technology plays in shaping and supporting the business, let’s look at how strategies are developed.  There has been a significant amount of research done and published on how strategies are developed.  Here are some relevant highlights.

There are two main dimensions to strategy development.

  1. Visionary thinking, based on intuition and instinct, and an ability to make bold predictions and define goals.
  2. Analytical thinking, in which strategy development is based on scientific analysis: considering options and making recommendations based on that analysis, followed by implementation. This dimension typically comprises three stages:
    • Strategic Analysis, guided by a scientific approach: understanding your markets, competitors, value chain and the bargaining power of the key stakeholders.  It also entails understanding the strengths and weaknesses of your organization against the opportunities and threats that the external environment presents
    • Strategy Formulation, guided by the analytical findings and aligned to the vision and overall goals of the organization, to create a strategic road-map
    • Strategy Implementation, which is of course converting the strategy into real results by successfully executing it

It is strategy development that is the focus of this article, specifically strategic analysis, which then guides strategy formulation and implementation.

Is there a place for technological considerations in strategic analysis? The answer becomes quite apparent through the examples that follow.

Technological Influences on the Business Landscape

Examples of technologies that have had a transformative impact on business value chains and have redefined markets and distribution channels are all around us.

The globalization phenomenon enabled by the Internet is one of the most profound. The Internet has impacted all the traditional dimensions of business strategy (reduction in barriers to entry, increased market size across the globe without the limitations of geographic divide, increased competition, etc.).

The financial services industry is a prime example of an industry where technology has transformed the value chain, redefined competitive forces and given consumers a tremendous amount of bargaining power.  Entry barriers have been declining and new competitors have emerged. Some financial products and services have become more transparent and commoditized, making the market more competitive. The Internet as a tool to create a new service delivery channel (reduced channel costs, 24x7 availability) has put pressure on the more traditional branch-based channels, and the resulting service delivery cost structure has changed. ING operates on the model that bricks and mortar are not required to sell its banking products and services.

The healthcare value chain has likewise been transformed by technological advances. Linking healthcare records through electronic information exchange and moving diagnostic imaging from traditional film to digital have redefined the value chain and changed the balance of power between suppliers and buyers, not to mention the very nature of the products and services being delivered.

The retail industry is another example where technology has changed the business landscape: Amazon’s strategic business model was completely defined by technology.

Relationship between Business and Technology

Given how profoundly technology has influenced our business and personal lives, it is hard to fathom how a successful business strategy can be defined without considering technological influences and enablers.  By creating a partnership between Business and Technology at the strategy development stage, you create a strategy that is well-formed and can maximize business value and competitive positioning by embedding technological considerations from the very start (and not as an afterthought!).

So why is it that there is a significant divide between Business and Technology?  In subsequent articles, I will focus on the barriers (real or perceived) that create this divide between Business and Technology.

If you have examples to demonstrate the benefits of business/technology partnerships, please share your thoughts on this forum.

The Implicit Value of Content is Realized Through Business Process

As I have noted before, much of the historic discussion in the document management field has concerned the cost of producing content, or the cost of finding existing content. But the value of a document, or any other piece of content, is seldom the same as its cost of production.

I was chatting about this the other day with my colleague James Latham. He used an invoice as an example of a piece of content that may be managed by an enterprise content management (ECM) system. James noted that, ‘There is inherent or explicit value in an invoice’. In fact the value of an invoice is fairly tightly linked to the cash it represents. A $10 bill has an explicit value of $10. Likewise a delivered invoice for $10 has a value of about $10 to an organization. Arguably it is not quite as valuable as $10 cash given the delay and perhaps uncertainty of payment, but it is close enough in most cases and will be treated as such in an accounting system.

There is a case where a $10 bill is worth much more: if it is a rare, old $10 bill, it may have a lot of implicit value (e.g. to collectors it may be worth hundreds of dollars) above its explicit value of $10. Tangible value (explicit plus implicit) is established by sale of the item itself or the recent valuations of comparable items. But it is hard to think of invoices, especially electronic invoices (i.e. digital content), as having any implicit value.

Are there other kinds of enterprise content besides invoices that clearly have implicit value? I think so. Here’s a good example: documents that support a patent application for a product with large market potential may have huge implicit value that greatly exceeds their cost of production and their explicit value at a given moment. This implicit value may become more explicit over time with the issue of a patent, together with product and market advances. At some point an intellectual property sale could attribute very significant tangible value to the documentation.

In this patent documentation example, the application of process over time helps to create tangible value. In ECM discussions we often speak of the context of content as helping to give it meaning, but clearly we also need to consider how process can give it value.

Enterprise Content Architecture – my take on the Metastorm acquisition

I’m particularly excited by today’s announced acquisition of Metastorm by OpenText, but perhaps not for the same reasons as many others.

What excites me is the potential of Metastorm’s strengths in Enterprise Architecture (EA) and Business Process Analysis (BPA). As noted in the release: “Metastorm is a leader in both BPA and EA as recognized by Gartner in the Gartner Magic Quadrant for Business Process Analysis Tools, published February 22, 2010 and the Gartner Magic Quadrant for Enterprise Architecture Tools, published October 28, 2010.” These capabilities play to both the ‘Enterprise’ and the ‘Content’ in Enterprise Content Management (ECM).

Organizations depend on a growing proportion of knowledge workers, as I discussed in a previous post (Value for Knowledge Workers), but as noted in the McKinsey study I covered (Boosting the productivity of knowledge workers), most organizations do not understand how to boost the productivity of knowledge workers, or indeed the barriers to that productivity. As I noted: “What struck me in reading the article is that while an increasing proportion of staff in companies are knowledge workers, it is not clear what knowledge work is and how to best enable it to drive productivity gains. Given that, it is hardly surprising that people struggle to define the value of those software tools best able to support knowledge management.”

Content is the currency of knowledge work. It supports the exchange of knowledge during business processes, and is very often the work product of such processes (e.g. a market analysis report, an engineering drawing or a website page). Too often in the past, discussion of the value of content has centered on either reducing the unit cost of producing, finding or using content, or mitigating the compliance risks created by poor content management.

This is not a new theme for me; indeed, last August I expressed my enthusiasm for why Content Matters. I noted: “It’s no surprise to people that you can understand a business by ‘following the money’ or ‘following the customer’ and that is the basis for ERP and CRM systems. On the other hand most people are only just coming to realize that ‘following the content’ is just as important, so while we’ve talked about content management for many years, that conversation is starting to be important to business.”

The potential to apply Metastorm’s ProVision tool set to elucidate and illustrate the critical role of content in the achievement of enterprise goals is an exciting one which offers new value to our customers.

Value for Knowledge Workers

Demonstrable value goes a long way to supporting the deployment of new software tools.

For structured business processes, return on investment (r.o.i.) is comparatively easy to estimate. Where unstructured or semi-structured digital content items (e.g. documents, spreadsheets, faxes, etc.) enable a given structured process (e.g. accounts receivable) their contribution to the overall value created is also typically quantifiable.

Where the process itself is unstructured the measurement of value is much harder. Perhaps the largest class of unstructured processes in a company fall in the category of knowledge work. The difficulties organizations have in understanding knowledge work is highlighted in an article just published in the McKinsey Quarterly entitled: “Boosting the productivity of knowledge workers”.

  • Aside: Unfortunately a subscription is required to read the full article – hopefully you have one.

The article starts with the proposition that few senior executives can answer the question: “Are you doing all that you can to enhance the productivity of your knowledge workers?” This is unfortunate because, “Organizations around the world struggle to crack the code for improving the effectiveness of managers, salespeople, scientists, and others whose jobs consist primarily of interactions—with other employees, customers, and suppliers—and complex decision making based on knowledge and judgment.”

The authors, Eric Matson and Laurence Prusak, describe five common barriers that hinder knowledge workers in more than half of the interactions in surveyed companies:

  1. Physical
  2. Technical
  3. Social or Cultural
  4. Contextual, and
  5. Temporal

Physical barriers include geographic and time zone separation between workers, and are typically linked to Technical challenges – where workers lack the necessary tools to overcome the physical barriers that separate them. As the article notes, there are many software tools available that can help – these would include the various collaborative and social media tools, as well as the more classic document management applications that are encompassed in the broadest definitions of Enterprise Content Management (ECM).

Of course the availability of software tools does not guarantee that users will use them effectively; indeed, Social (e.g. organizational restrictions, opposing incentives and motivations) and Contextual barriers (e.g. not knowing who to consult or to trust) play a large part in hindering adoption.

The fifth barrier is Temporal. Time, or rather the perceived lack of it, is also a critical factor. In my experience knowledge workers do not consider time spent using social media and collaborative tools as important as other activities. Under time pressure they will stop using these tools if they need to spend more time on other activities they perceive as “real work”.

What struck me in reading the article is that while an increasing proportion of staff in companies are knowledge workers, it is not clear what knowledge work is or how best to enable it to drive productivity gains. Given that, it is hardly surprising that people struggle to define the value of those software tools best able to support knowledge management.

Content Matters

I was chatting with a colleague yesterday and he related how he interviews people to join our company. We quickly dropped into role playing – with me as the job candidate. He had a compelling proposition, but as I told him, he was missing the thing that excited me: Content Matters!

As I started to tell him why content matters I found myself getting excited. I realized I’m actually quite passionate about it! Not content itself, but what it enables and how it’s used.

Content matters to companies in a way that changes how they work, how they create value and whether they succeed. It matters whether they recognize that fact or not.

If you want to understand what drives a company look at their value chain – how they create value – and how they are organized to execute each stage in the value chain. Within each stage there are typically many processes, each with many steps. At almost every step there is some content that is created, reviewed, followed or otherwise used; how well this is done makes a difference to effectiveness.

It’s no surprise to people that you can understand a business by ‘following the money’ or ‘following the customer’ and that is the basis for ERP and CRM systems.

On the other hand most people are only just coming to realize that ‘following the content’ is just as important, so while we’ve talked about content management for many years, that conversation is starting to be important to business.

Considering the Cost & Value of Digital Content for an Enterprise

The way that the value of digital content changes over time, and how an enterprise content management (ECM) system might help to realize and/or retain greater value, was the subject of my last post (http://martin-fulcrum.blogspot.com/2010/06/calculating-value-of-content-in-ecm.html). Lee Dallas retweeted that post, but also referenced a very interesting earlier blog post (2008) by fellow member of ‘Big Men on Content‘ Marko Sillanpää on the cost of content (link). Sillanpää considered content lifecycle costs as follows:

Cost of Content = (Annual Authoring Costs + Annual Review Costs) / New Objects per Author

Content authoring and review are not the only activities that incur cost – there are costs associated with each step in its lifecycle, notably including the costs of distribution, storage and ultimate destruction. Effective content distribution is becoming increasingly important to the realization of value.

Cost and value are of course different concepts. The cost of an item does not necessarily reflect its value, as anyone who has watched the TV show “Antiques Roadshow” knows!

In business, where there is an emphasis on the bottom line, the value of content ought on average to exceed its cost, or it should not have been created. But for a given piece of content, its cost is generally related to size and complexity, not what it enables. On the other hand, value is tied to enablement and varies over time – often declining gradually or precipitously, but sometimes increasing!

It can be hard to explain to people how managing content benefits a business. However, I have found that identifying its ‘enterprise value’ is powerful. A good top-down approach is to reference the value chain of a business, using Michael Porter’s original simple model. People understand that enterprises take input from suppliers and partners and, through a series of steps, add value that can be realized in a final sale to customers. Clearly the effective execution of those steps adds to efficiency. When challenged, most people can identify content that contributes or is even essential to the completion of each of those value steps and their constituent processes. For example, an Engineering Department must create, review and approve engineering drawings, and then pass them on to the Manufacturing Department (see E, C & O value chain).

In my experience, taking a value perspective is generally more attractive, especially in growth industries, than a cost and cost avoidance perspective – which has classically been the basis for return-on-investment (R.O.I.) approaches to software justification.

Syndicated at http://conversations.opentext.com/
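
As a toy illustration of Sillanpää’s formula, here is a minimal sketch; the per-author figures below are invented for the example and are not drawn from his post.

```python
# A toy illustration of Sillanpää's cost-of-content formula.
# All input figures are invented for the example.

def cost_of_content(annual_authoring_costs, annual_review_costs, new_objects_per_author):
    """Cost of Content = (Annual Authoring Costs + Annual Review Costs) / New Objects per Author."""
    return (annual_authoring_costs + annual_review_costs) / new_objects_per_author


if __name__ == "__main__":
    unit_cost = cost_of_content(
        annual_authoring_costs=90_000,   # hypothetical per-author authoring cost
        annual_review_costs=20_000,      # hypothetical per-author review cost
        new_objects_per_author=120,      # hypothetical annual output per author
    )
    print(f"Cost per content object: ${unit_cost:,.2f}")
```

Even this trivial calculation makes the broader point of the post: the resulting number reflects what the content cost to produce, and says nothing about what it enables.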

The ‘Second Coming’ of Renditions – Video

Long-time ECM veterans will remember the concept of a document rendition – a transformed alternative. I think we’ll see renditions again. A rendition is essentially another form of a specific version of a document. There are two common types of renditions, based on format and content:

  1. The same information content as the original document, but a different file format
    • For example, a spreadsheet file can be renditioned as a PDF
  2. The same file format as the original document, but different content
    • For example, a MS PowerPoint Document written in English can have a rendition that is also a PowerPoint file, but whose content has been translated into French

Renditions for limited bandwidth in the ’90s

In the 1990s, one of the common use cases was to deal with the limited bandwidth available at the time. It often took a long time to download and open a document just to see if it contained what you were looking for. Accordingly, Open Text Livelink automatically made HTML renditions of many common formats such as MS Word that were much smaller files and so could be downloaded much faster for quick review. I remember presenting the use case to customers: “If you want to look quickly at a file without opening the full thing…” Back then bandwidth was so limited it made sense. Now it seldom does, although there are specific use cases, like renditions that contain added content such as secured signatures, that still have value.

Bandwidth issues are back

Bandwidth is becoming limiting again – not for ‘simple’ text documents, but for rich media files such as videos. In fact bandwidth issues are so acute that the shape of the Internet has changed radically in the last few years. The explosive growth of video sharing has led to the rise of Content Delivery or Distribution Networks (CDNs) such as Akamai Technologies, Limelight Networks, CDNetworks and Amazon CloudFront to enable effective distribution. Akamai recently claimed they handle around 20% of Internet traffic by volume – most of this traffic is rich media which must be delivered very quickly, as users expect pages to load extremely quickly even if they contain a video. A recent Forrester report says the expected threshold to load a page has become two seconds.

For video files to be useful to end users they have to start to play almost instantly. This is usually achieved by:

  • Locating a copy in close network proximity to the end user
    • CDNs use many distributed sites around the ‘edge of the Cloud’ to ensure that there is at least one site close to an end user, preloaded with files that are expected to be required
  • Reducing the size of the video through transcoding and compression
  • Streaming – starting to play before all of the content is received

The increasing use of mobile devices, with narrow and unstable bandwidth connections and different format requirements, creates further hurdles to serving users rapidly.

Enterprise needs

So what about the enterprise or corporate user? Trained by the web, he or she expects to click on a link and have a video start playing within two seconds. But most internal ECM systems (e.g. for document management) are designed to download a complete file before it is available to the end user.

A story – Here’s a scenario I experienced recently. A Finance department prepared a new expense form. To show staff how to use it, they prepared a five-minute video. The trouble was that their WMV format video was over 300MB. For most staff in a global company, especially remote staff, downloading a 300MB file to view it is just not practical. What Finance needed was to be able to upload the video and have the system take care of making a rendition that was transcoded and compressed, made stream-able and hosted on a CDN.

There are just too many manual steps and too many options for most newcomers to video creation. Systems should take care of most of those steps. And one excellent way to execute several steps is to have the ECM system create a rendition of a deposited video that contains embed code to start a player and stream the video from a CDN. Consumer users can then simply click on the object name in their ECM system and a streamed video starts to play almost instantly – as they have come to expect with sites such as YouTube.

So renditions have a place in the new enterprise again, to deal with bandwidth limitations!

Syndicated at http://conversations.opentext.com/
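
To make that suggestion a little more concrete, here is a minimal sketch of what such a rendition step might look like. It assumes ffmpeg is installed, uses an invented CDN URL and file names, and omits the provider-specific upload step; it is not how any particular ECM product implements renditions.

```python
# A minimal sketch of the rendition idea described above: transcode a deposited
# video into a smaller, stream-friendly format and generate embed code pointing
# at its (hypothetical) CDN location. The CDN URL and file names are assumptions.

import subprocess
from pathlib import Path

CDN_BASE_URL = "https://cdn.example.com/videos"   # hypothetical CDN host


def transcode_for_streaming(source: Path, target: Path) -> None:
    """Re-encode to an H.264/AAC MP4 with the index moved to the front so
    playback can begin before the whole file downloads (needs ffmpeg on PATH)."""
    subprocess.run(
        ["ffmpeg", "-i", str(source),
         "-c:v", "libx264", "-crf", "28",    # smaller file via H.264 at modest quality
         "-c:a", "aac",
         "-movflags", "+faststart",          # allow progressive/streamed playback
         str(target)],
        check=True,
    )


def embed_snippet(rendition_name: str) -> str:
    """HTML the ECM system could store alongside the original object."""
    return (f'<video controls preload="metadata" width="640" '
            f'src="{CDN_BASE_URL}/{rendition_name}"></video>')


if __name__ == "__main__":
    source = Path("expense_form_walkthrough.wmv")    # stand-in for the 300MB original
    rendition = source.with_suffix(".mp4")
    transcode_for_streaming(source, rendition)
    # Uploading the rendition to the CDN is omitted; that step is provider-specific.
    print(embed_snippet(rendition.name))
```

The end user never sees any of this machinery; they click the object name and the embed snippet streams a file that is a fraction of the size of the original.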