Blog Archives

The Big Data Movement: Importance and Relevance Today

We are entering a new age in which we seek a finer understanding of relationships: between businesses and customers, organizations and employees, products and how they are used, and how different aspects of a business and an organization connect to produce meaningful, actionable information. We are seeing a lot of data, and the old tools for managing, processing and gathering insights from it (spreadsheets, SQL databases and the like) do not scale to current needs. Thus, Big Data is emerging as a framework for how to process, store and cope with the reams of data being collected.

According to IDC, it is imperative that organizations and IT leaders focus on the ever-increasing volume, variety and velocity of information that forms big data.

  • Volume. Many factors contribute to the increase in data volume – transaction-based data stored through the years, text data constantly streaming in from social media, increasing amounts of sensor data being collected, etc. In the past, excessive data volume created a storage issue. But with today’s decreasing storage costs, other issues emerge, including how to determine relevance amidst the large volumes of data and how to create value from data that is relevant.
  • Variety. Data today comes in all types of formats – from traditional databases to hierarchical data stores created by end users and OLAP systems, to text documents, email, meter-collected data, video, audio, stock ticker data and financial transactions. By some estimates, 80 percent of an organization’s data is not numeric! But it still must be included in analyses and decision making.
  • Velocity. According to Gartner, velocity “means both how fast data is being produced and how fast the data must be processed to meet demand.” RFID tags and smart metering are driving an increasing need to deal with torrents of data in near-real time. Reacting quickly enough to deal with velocity is a challenge to most organizations.

SAS has added two additional dimensions:

  • Variability. In addition to the increasing velocities and varieties of data, data flows can be highly inconsistent, with periodic peaks. Is something big trending on social media? Daily, seasonal and event-triggered peak data loads can be challenging to manage – especially with social media involved.
  • Complexity. When you deal with huge volumes of data, it comes from multiple sources. It is quite an undertaking to link, match, cleanse and transform data across systems. However, it is necessary to connect and correlate relationships, hierarchies and multiple data linkages or your data can quickly spiral out of control. Data governance can help you determine how disparate data relates to common definitions and how to systematically integrate structured and unstructured data assets to produce high-quality information that is useful, appropriate and up-to-date.


So to reiterate, Big Data is a framework stemming from the realization that data has gathered significant pace and that its growth has exceeded the capacity of an organization to handle, store and analyze it in a manner that offers meaningful insights into the relationships between data points. I call this a framework, unlike other materials that treat Big Data as merely a consequence of organizations' inability to handle mass amounts of data. I refer to Big Data as a framework because it sets the parameters around an organization's decision as to when, and which, tools must be deployed to address data scalability issues.

Thus, to put the appropriate parameters around when an organization should consider Big Data as part of its analytics roadmap in order to better understand the patterns in its data, it has to answer the following ten questions:

  1. What are the different types of data that should be gathered?
  2. What are the mechanisms that have to be deployed to gather the relevant data?
  3. How should the data be processed, transformed and stored?
  4. How do we ensure that there is no single point of failure in data storage, and no data loss that may compromise data integrity?
  5. What are the models that have to be used to analyze the data?
  6. How are the findings of the data to be distributed to relevant parties?
  7. How do we assure the security of the data that will be distributed?
  8. What mechanisms do we create to implement feedback against the data to preserve data integrity?
  9. How do we morph the big data model into new forms that account for new patterns, reflecting what is meaningful and actionable?
  10. How do we create a learning path for the big data model framework?

Some of the existing literature has commingled the Big Data framework with analytics. In fact, the literature has gone so far as to make a rather assertive claim: that Big Data and predictive analytics should be looked upon in the same vein. Nothing could be further from the truth!

There are several tools available in the market to do predictive analytics against a set of data that may not qualify for the Big Data framework. While I was the CFO at Atari, we deployed business intelligence tools using MicroStrategy, which had predictive modules. More recently, we explored SAS and Minitab tools for predictive analytics. In fact, even Excel can do multivariate analysis, ANOVA, regression and best-curve-fit analysis. These analytical techniques have been part of the analytics arsenal for a long time. Different data sizes may need different tools to instantiate relevant predictive analysis. This is a very important point, because companies that do not have Big Data ought to seriously reconsider their strategy of what tools and frameworks to use to gather insights. I have known companies that have gone the Big Data route although all data points (excuse my pun), even after incorporating capacity and forecasts, suggest that alternative tools are more cost-effective than implementing Big Data solutions. Big Data is not a one-size-fits-all model, and it is an expensive implementation. However, for the right data size, which in this case would be very large, a Big Data implementation would be extremely beneficial and cost-effective in terms of the total cost of ownership.
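To make the point concrete, here is a minimal sketch, in Python with made-up quarterly sales figures, of the kind of trend regression that Excel, SAS or Minitab performs routinely; nothing about it requires a Big Data stack:

```python
# A minimal small-data predictive analytics sketch: an ordinary
# least-squares trend fit on quarterly sales. All figures are
# hypothetical, purely for illustration.
import numpy as np

quarters = np.arange(1, 9)                                   # Q1..Q8
sales = np.array([1.2, 1.4, 1.3, 1.6, 1.8, 1.7, 2.0, 2.1])  # $M, made up

# Fit a straight line: sales ~ slope * quarter + intercept
slope, intercept = np.polyfit(quarters, sales, 1)

# Extrapolate one quarter ahead
q9_forecast = slope * 9 + intercept
print(f"Trend: {slope:.3f} $M/quarter; Q9 forecast: {q9_forecast:.2f} $M")
```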

Areas Where the Big Data Framework Can Be Applied!

Some areas lend themselves to the application of the Big Data Framework.  I have identified broadly four key areas:

  1. Marketing and Sales: Consumer behavior, marketing campaigns, sales pipelines, conversions, marketing funnels and drop-offs, distribution channels are all areas where Big Data can be applied to gather deeper insights.
  2. Human Resources: Employee engagement, employee hiring, employee retention, the organization knowledge base, the impact of cross-functional training, reviews and compensation plans are all elements on which Big Data can surface insights. After all, over 60% of company resources are generally invested in HR.
  3. Production and Operational Environments: Data growth, different types of data appended as the business learns about the consumer, concurrent usage patterns, traffic, web analytics are prime examples.
  4. Financial Planning and Business Operational Analytics: Predictive analytics around bottom-up sales, marketing campaign ROI, customer acquisition costs, earned media and paid media, margins by SKU and distribution channel, operational expenses, portfolio evaluation, risk analysis, etc., are some of the examples in this category.

Hadoop: A Small Note!

Hadoop is becoming a more widely accepted tool for addressing Big Data needs. It is an open-source implementation of ideas Google published in its distributed file system and MapReduce papers, which Google developed so it could index the structured and text information it was collecting and present meaningful, actionable results to users quickly. Hadoop was subsequently developed heavily at Yahoo, which tweaked it for enterprise applications.

Hadoop runs on a large number of machines that don't share memory or disks, with the Hadoop software running on each of them. Thus if you have, for example, over 10 gigabytes of data, you take that data and spread it across different machines, and Hadoop tracks where all of it resides. The servers or machines are called nodes, and a group of nodes working together over the distributed data is called a cluster. Each server operates on its own little piece of the data, and once the data is processed, the results are combined and delivered to the client as a unified whole. The mechanism that maps work out to the nodes and then reduces their partial results into one combined answer is MapReduce, a core component of Hadoop. You will also hear of Hive, which is a data warehouse layer built on top of Hadoop: it organizes and queries the data (structured or unstructured) that Hadoop stores and processes, with redundancy across the cluster, using MapReduce under the hood.
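To give a feel for the MapReduce mechanism described above, here is a toy sketch in Python that mimics the map, shuffle and reduce phases on a single machine; a real Hadoop job expresses the same two functions, but runs them against data spread across the nodes of a cluster:

```python
# A toy, single-machine illustration of the MapReduce model:
# map emits (key, value) pairs, the shuffle groups them by key,
# and reduce folds each group into one result per key.
from collections import defaultdict

documents = ["big data needs big tools", "data tools scale"]

# Map phase: emit (word, 1) for every word in every document
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle phase: group the emitted values by key (the word)
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce phase: sum each word's counts into a single total
word_counts = {word: sum(counts) for word, counts in groups.items()}
print(word_counts)  # {'big': 2, 'data': 2, 'needs': 1, 'tools': 2, 'scale': 1}
```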

Personally, I have always been interested in Business Intelligence. I have always considered BI a stepping stone in the new age, a handy tool to truly understand a business and develop financial and operational models that track fairly closely the trending insights the data generates. So my ear is always to the ground as I follow developments in this area, and though I have not implemented a Big Data solution, I have been, and will continue to be, interested in seeing its applications in certain contexts and against the various use cases in organizations.


Implementing Balanced Scorecard Model for Employee Engagement

The Balanced Scorecard Model (BSC) was introduced by Kaplan & Norton in their book “The Balanced Scorecard” (1996). It is one of the more widely used management tools in large organizations.

One of the major strengths of the BSC model is how its key categories link to corporate missions and objectives. These key categories, referred to as “perspectives,” are:

Financial Perspective

Kaplan and Norton do not disregard the traditional need for financial data. Timely and accurate data will always be a priority, and managers will do whatever is necessary to provide it. In fact, there is often more than enough handling and processing of financial data. With the implementation of a corporate database, it is hoped that more of the processing can be centralized and automated. But the point is that the current emphasis on financials leads to an “unbalanced” situation with regard to the other perspectives. There is perhaps a need to include additional financial-related data, such as risk assessment and cost-benefit data, in this category.

Customer Perspective

Recent management philosophy has shown an increasing realization of the importance of customer focus and customer satisfaction in any business. These are leading indicators: if customers are not satisfied, they will eventually find other suppliers that will meet their needs. Poor performance from this perspective is thus a leading indicator of future decline, even though the current financial picture may look good. In developing metrics for satisfaction, customers should be analyzed in terms of the kinds of customers and the kinds of processes for which we are providing a product or service to those customer groups.

Internal Business Process Perspective

This perspective refers to internal business processes. Metrics based on this perspective let managers know how well their business is running, and whether its products and services conform to customer requirements (the mission). These metrics have to be carefully designed by those who know the processes most intimately; given each organization's unique mission, they are not necessarily something that outside consultants can develop. My personal opinion is that the internal business process perspective is too important to delegate: internal owners and/or teams should take ownership of understanding the process.

Learning and Growth Perspective

This perspective includes employee training and corporate cultural attitudes related to both individual and corporate self-improvement. In a knowledge-worker organization, people — the only repository of knowledge — are the main resource. In the current climate of rapid technological change, it is becoming necessary for knowledge workers to be in a continuous learning mode. Metrics can be put into place to guide managers in focusing training funds where they can help the most. In any case, learning and growth constitute the essential foundation for success of any knowledge-worker organization.

Kaplan and Norton emphasize that ‘learning’ is more than ‘training’; it also includes things like mentors and tutors within the organization, ease of communication among workers, the engagement of the workers, the potential of cross-training to create pockets of bench strength and switch hitters, and other employee-specific programs that allow workers to readily get help on a problem when it is needed. It also includes technological tools, what the Baldrige criteria call “high performance work systems.”

Innovation Perspective

This perspective was appended to the above four by Bain and Company. It refers to the vitality of the organization and the capacity of its culture to provide the appropriate framework to encourage innovation. Organizations have to innovate: innovation is becoming the key distinguishing element in great organizations, and high levels of innovation or innovative thinking are talent magnets.

Taking the perspectives a step further, Kaplan and Norton instituted measures and targets associated with each perspective. The measures are geared around the objective associated with each perspective rather than any single granular item. Thus, if the objective is to increase customer retention, the appropriate metric or set of metrics is one that measures that objective and tracks success toward it, rather than one that merely defines or counts customers.

One of the underlying presumptions in this model is that the key elements around which objectives are defined are specified at a fairly detailed level, to the extent possible so precisely that an element does not carry polymorphous connotations. In other words, there is, and can be, only a single source of truth for each key element. That preserves the integrity of the model and prevents an element from branching out into a plethora of loosely related objectives once the model is applied.

Objectives, Measures, Targets and Initiatives


Within each of the Balanced Scorecard perspectives (financial, customer, internal process, learning and growth, and innovation), the firm must define the following:

Strategic Objectives – what the strategy is to achieve in that perspective

Measures – how progress for that particular objective will be measured

Targets – the target value sought for each measure

Initiatives – what will be done to facilitate reaching the target
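As an illustration of how these four definitions hang together, here is a small sketch in Python; the structure and field names are my own hypothetical shorthand, not Kaplan and Norton's notation:

```python
# A hypothetical sketch of one scorecard line item: an objective
# within a perspective, with its measure, target, and initiative.
from dataclasses import dataclass

@dataclass
class ScorecardItem:
    perspective: str   # e.g. "Customer"
    objective: str     # what the strategy is to achieve
    measure: str       # how progress will be measured
    target: float      # the target value sought for the measure
    initiative: str    # what will be done to reach the target

item = ScorecardItem(
    perspective="Customer",
    objective="Increase customer retention",
    measure="12-month retention rate (%)",
    target=90.0,
    initiative="Launch proactive account-review program",
)
print(item)
```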

As with models and analytics generally, the information the model spouts can be rife with a cascade of metrics. Metrics are important, but too many metrics associated with the perspectives may diffuse the ultimate ends the perspectives represent.

Hence, one has to exercise restraint and rigor in defining a few key metrics that are most relevant and roll up to corporate objectives. As an example, outlined below are metrics associated with the perspectives:

  • Financial performance (revenues, earnings, return on capital, cash flow);
  • Customer value performance (market share, customer satisfaction measures, customer loyalty);
  • Internal business process performance (productivity rates, quality measures, timeliness);
  • Employee performance (morale, knowledge, turnover, use of best demonstrated practices);
  • Innovation performance (percent of revenue from new products, employee suggestions, rate of improvement index).

To construct and implement a Balanced Scorecard, managers should:

  • Articulate the business’s vision and strategy;
  • Identify the performance categories that best link the business’s vision and strategy to its results (e.g., financial performance, operations, innovation, and employee performance);
  • Establish objectives that support the business’s vision and strategy;
  • Develop effective measures and meaningful standards, establishing both short-term milestones and long-term targets;
  • Ensure company-wide acceptance of the measures;
  • Create appropriate budgeting, tracking, communication, and reward systems;
  • Collect and analyze performance data and compare actual results with desired performance;
  • Take action to close unfavorable gaps.

Source: http://www.ascendantsmg.com/blog/index.cfm/2011/6/1/Balanced-Scorecard-Strategy-Map-Templates-and-Examples

The link above contains a number of templates and examples that you may find helpful.

I have discussed organization architecture and employee engagement in previous blogs. The BSC is a tool that encourages engagement while ensuring a tight architecture in service of organizational goals. As an employee, you may forget that you occupy an important place in the ecosystem; that forgetting speaks neither to disenchantment with the job nor to disinclination toward the uber-goals of the organization. It speaks, potentially, to a lack of credible leadership that has not made the effort to engage the organization by pushing a structure that forces transparency. The BSC is one such articulate model that can be used, even in its crudest form factor, to get employees informed and engaged.

Risk Management and Finance

If you are in finance, you are a risk manager. Say what? Risk management! Imagine being the hub of a wheel of functional areas, each embedded with a risk pattern that can vary over time. A sound finance manager is one who can keep a pulse on each and support the decisions that can contain the risk. Thus, value management becomes critical: weighing the consequence of a decision against the risk that the decision poses. Not cost management, but value management. And to make value management concrete, we turn to cash impact, or rather the discounted value of the future stream of cash that may or may not be a consequence of a decision. Companies carry risks; if they did not, a company would not offer any premium in value to the market. Risks, well managed, create competitive advantage, with sustained growth in free cash flow as the key metric that becomes the separator.
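Since value management here reduces to the discounted value of a future stream of cash, a minimal sketch of that arithmetic, with invented cash flows and discount rate, looks like this in Python:

```python
# Discounted value of a future stream of cash flows:
# PV = sum of CF_t / (1 + r)**t over each year t.
def present_value(cash_flows, rate):
    """cash_flows: expected cash flow per year, starting at year 1."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Hypothetical decision: a project expected to throw off these
# cash flows ($M) over five years, discounted at 10%.
flows = [1.0, 1.2, 1.4, 1.5, 1.5]
print(f"PV of the stream: {present_value(flows, 0.10):.2f} $M")
```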

John Kay, an eminent strategist, identified four sources of competitive advantage: Organization Architecture and Culture, Reputation, Innovation and Strategic Assets. All of these are inextricably intertwined and must be aligned to serve value in the company. The business-value approach underpins the interrelationships best, and in so doing, scenario planning emerges as a sound mechanism for managing risks. Understanding the profit impact of a strategy, and the tie-in between capabilities and initiatives, is one of the most crucial conversations a good finance manager can encourage in a company. Product, market and internal capabilities become the anchor points of evolving discussions. Scenario planning thus emerges in the context of trends and uncertainties: a trend in patterns may open up possibilities, the latter being in the domain of uncertainty.

There are multiple methods one could use in building scenarios and engaging in fruitful risk assessment.
  1. Sensitivity Assessment: Evaluate decisions in the context of the strategy's reliance on the resilience of business conditions. Assess the various conditions in a scenario, or across mutually exclusive scenarios, attach a probabilistic guesstimate to the success factors, and then offer simple solutions (a sketch of this follows the list). This assessment tends to be heuristic-oriented, and it is excellent when one is dealing with a few specific decisions and there is an elevated sense of clarity about the business conditions that may present themselves. It is the most commonly used method, but it does not address the more realistic conditions where clarity is obfuscated and muddy.
  2. Strategy Evaluation: Use scenarios to test a strategy by throwing in a layer of interaction complexity. To the extent you can disaggregate the complexity, the evaluation of a strategy is more tenable. But disaggregation has its downsides: we don't operate in a vacuum. It is the aggregation, and negotiating through this aggregation effectively, where the real value is. You may have heard of the McKinsey MECE (Mutually Exclusive, Collectively Exhaustive) methodology, in which strategic thrusts are disaggregated and contained within a narrow framework. The idea is that if one does that enough, one has untrammeled confidence in choosing one initiative over another. That is true in some cases, but my belief is that the world operates at a more synthetic level than a purely analytic one. We resort to analytics because it is too damned hard to synthesize and agree on an optimal solution. I am not knocking analytics; I am only suggesting that there is some possibility that a false hypothesis is accepted and a true one rejected. Thus analytics is an important tool, but it must be weighed along with the synthetic tradition.
  3. Synthetic Development: By far the most interesting, and perhaps the most controversial, with a glint of academic and theoretical monstrosities included. This represents developing and broadcasting all scenarios equally weighted, and grouping the interactions of scenarios. Thus, if introducing a multi-million-dollar initiative in untested waters is a decision you have to weigh, you must go through the first two methods, and then review the final outcome against peripheral factors that were not introduced initially. A simple statement or realization, like “The competition for Southwest is the Greyhound bus,” could significantly alter the expanse of the strategy.
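As promised in the first item, here is a minimal sketch of a sensitivity assessment: each mutually exclusive scenario carries a probabilistic guesstimate and a payoff, and the decision is weighed by the expected value across scenarios. The scenarios and numbers are invented for illustration:

```python
# A hypothetical sensitivity assessment: each scenario carries a
# guesstimated probability and a payoff ($M); the expected value
# weighs the decision across mutually exclusive scenarios.
scenarios = {
    "strong demand": {"probability": 0.3, "payoff": 5.0},
    "base case":     {"probability": 0.5, "payoff": 2.0},
    "downturn":      {"probability": 0.2, "payoff": -1.5},
}

# Mutually exclusive and exhaustive: probabilities must sum to 1
assert abs(sum(s["probability"] for s in scenarios.values()) - 1.0) < 1e-9

expected = sum(s["probability"] * s["payoff"] for s in scenarios.values())
print(f"Probability-weighted payoff: {expected:.2f} $M")
```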

If you think the new world of finance is nothing more than crunching numbers … stop and think again. Yes, crunching those numbers plays a big part, but it is less a cause than an effect of the mental model you appropriate in this prized profession.