Category Archives: Model Thinking

Building a Lean Financial Infrastructure!

A lean financial infrastructure presumes the ability of every element in the value chain to preserve and generate cash flow. That is the fundamental essence of the lean infrastructure that I espouse. So what are the key elements that constitute a lean financial infrastructure?

And given those elements, what are the key tweaks one must continually make so that the infrastructure does not fall into entropy and the gains already made do not flatten or decay over time? Identifying the building blocks, monitoring them, and making rapid changes go hand in hand.


The Key Elements or the building blocks of a lean finance organization are as follows:

  1. Chart of Accounts: This is the critical unit that defines the starting point of the organization. It records and groups all of the key economic activities of the organization into larger bodies of elements such as revenue, expenses, assets, liabilities and equity. Granularity in these activities may lead to a fairly extensive chart of accounts and require more work to manage and monitor the accounts, demanding an incrementally larger investment of time and effort. However, the benefits of granularity far exceed the costs, because it forces management to look at every element of the business.
  2. The Operational Budget: Every year, organizations formulate the operational budget. It is generally a bottom-up rollup at a granular level that maps to the Chart of Accounts. It might also follow a top-down directive around where the organization wants to land with respect to income, expenses, balance sheet ratios, et al. Hence, there is almost always a process of iteration in this step before the Budget is finally locked down. Be mindful, though, that there are feeders into the budget relating to customers, sales, operational metrics targets, etc., all of which are part of building a robust operational budget.
  3. The Deep Dive into Variances: As you progress through the year, as part of the monthly closing process you inquire into how actual performance is tracking against the budget. Since the budget has been built at a granular level and mapped exactly to the Chart of Accounts, it becomes easier to understand and delve into the variances. Be mindful that every element of the Chart of Accounts must be evaluated. The general inclination is to focus on the large items or large variances while skipping the small expenses and smaller variances. That method, while efficient, might not be effective in the long run for building a lean finance organization. The rule, in my opinion, is that every account has to be looked at, and the question should be: why? If management agreed on a number in the budget, why are the actuals trending differently? Could the budget itself have been at fault, with something critical missed in that process? Or has there been a change in the underlying economics of the business, or a change in activities, that might be leading to these “unexpected variances”? One has to take a scalpel to both favorable and unfavorable variances, since one can learn a lot about the underlying drivers, and it might lead managerially to doing more of the better and less of the worse. Furthermore, this is also a great way to monitor leaks in the organization. Leaks are instances of cash dropping out of the system, and many little leaks can amount to a lot of cash in total. So do not disregard the leaks. Not only will attending to them preserve cash, but once you understand the leaks better, the organization will step up in efficiency and effectiveness with respect to cash preservation and delivery of value. (A minimal sketch of this account-level variance check follows this list.)
  4. Tweak the process: You will find that as you deep dive into the variances, you might want to tweak certain processes so these variances are minimized. This is generally true for adverse variances against the budget. Seek first to understand why the variance occurred, and then understand all of the processes that run in the background to generate activity in the account. Once you fully understand the process, it becomes a matter of tweaking it to marginally or structurally change key areas in ways that resonate favorably across the financials in the future.
  5. The Technology Play: Finally, evaluate the possibilities of using technology to surface issues early, automate repetitive processes, trigger alerts that mitigate issues before they grow, and provide on-demand analytics. Use technology to free up time and enable more thinking around how to improve the internal handoffs that further economic value in the organization.
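To make the variance deep dive concrete, here is a minimal sketch in Python of the account-level check described in point 3 above. The chart of accounts, the budget and actual figures, and the 5% review threshold are all hypothetical; the point is simply that every account gets a row and a "why?", not just the big ones.

```python
# Minimal sketch (not a production close process): budget-vs-actual variances
# for every account in a hypothetical chart of accounts. Figures are illustrative.
budget = {
    "4000-Revenue":   1_200_000,
    "5000-COGS":       -480_000,
    "6100-Salaries":   -300_000,
    "6200-Software":    -24_000,
    "6300-Travel":      -18_000,
}
actuals = {
    "4000-Revenue":   1_150_000,
    "5000-COGS":       -495_000,
    "6100-Salaries":   -298_000,
    "6200-Software":    -31_000,   # a small "leak" worth asking "why?" about
    "6300-Travel":      -12_000,
}

def variance_report(budget, actuals):
    """Return (account, budget, actual, variance, variance %) for EVERY account,
    favorable or unfavorable, large or small."""
    rows = []
    for account in sorted(budget):
        b, a = budget[account], actuals.get(account, 0)
        var = a - b
        pct = (var / abs(b)) * 100 if b else float("inf")
        rows.append((account, b, a, var, pct))
    return rows

for account, b, a, var, pct in variance_report(budget, actuals):
    flag = "REVIEW" if abs(pct) > 5 else "ok"   # hypothetical attention threshold
    print(f"{account:<16} budget={b:>12,} actual={a:>12,} var={var:>+10,} ({pct:+.1f}%) {flag}")
```

Even in this toy version, the small software overage surfaces alongside the headline revenue miss, which is exactly the habit the deep dive is meant to build.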

All of the above relate to managing the finance and accounting organization well within its own domain. However, there is a bigger step that comes into play once the blocks are established: linking corporate strategy to the continual evolution of the financial infrastructure.

The essential question that the lean finance organization has to answer is this: what can the organization do to address every element that preserves and enhances value to the customer, and how do we eliminate all non-value-added activities? This is largely a process question, but it forces one to understand the key processes and identify what percentage of each process is value-added to the customer versus non-value-added, measured along a time or cost dimension. The goal is to maximize value-added activity, on the presumption that such activity preserves cash and also increases cash coming in from the customer.
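As a simple illustration of that calculation, here is a small Python sketch; the process steps, timings and value-added tags are made up, and the same arithmetic applies if you substitute cost for time.

```python
# Hypothetical order-to-ship process: measure the share of cycle time that is
# value-added to the customer versus non-value-added.
steps = [
    ("Enter customer order",       15, True),    # (step, minutes, value-added?)
    ("Wait for credit approval",  120, False),
    ("Pick and pack",              30, True),
    ("Re-key data into billing",   20, False),
    ("Ship to customer",           45, True),
]

total = sum(minutes for _, minutes, _ in steps)
value_added = sum(minutes for _, minutes, va in steps if va)

print(f"Total cycle time:  {total} min")
print(f"Value-added time:  {value_added} min")
print(f"Value-added ratio: {value_added / total:.0%}")
print(f"Target: eliminate or shrink the remaining {total - value_added} min")
```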

The Unbearable Lightness of Being

Where the mind is without fear and the head is held high
Where knowledge is free
Where the world has not been broken up into fragments
By narrow domestic walls
Where words come out from the depth of truth
Where tireless striving stretches its arms towards perfection
Where the clear stream of reason has not lost its way
Into the dreary desert sand of dead habit
Where the mind is led forward by thee
Into ever-widening thought and action
Into that heaven of freedom, my Father, let my country awake.

– Rabindranath Tagore

Among the many fundamental debates in philosophy, one of the longest-running has been around the concept of free will. The debate stems from two arguments associated with it.

1) Since future actions are governed by the circumstances of the present and the past, human beings' future actions are predetermined by what has come before. Hence, the actions that occur are not truly a consequence of free will.

2) The counter-argument is that future actions are not necessarily determined and governed by the legacy of the present and the past, which leaves headroom for the individual to exercise free will.

Now one may wonder what determinism, or the lack of it, has to do with the current state of things in an organizational context. How is this relevant? Why are the abstract notions of determinism and free will important enough to be considered in the context of organizational evolution? How do they lend themselves to structured institutions like business organizations, whose sole purpose is to create products and services to meet market demand?

So we will throw a wrinkle into this line of thought: we now introduce an element of chance. How does chance change the entire dialectic? Chance is an unforeseen and random event that is not pre-determined; in fact, a chance event may not have a causal trigger at all. And chance, or luck, can be meaningful enough to untether an organization and its people to explore alternative paths. It is how the organization and its people are aligned to take advantage of that random, nondeterministic future that can make a huge difference to the long-term fate of the organization.

Inductive reasoning extrapolates: what held for n and n+1 is expected to hold for n+2. Induction creates predictability, and organizations therefore create pathways to exploit its logical extension. It is the most logical apparatus that exists to advance groups in a stable yet robust manner against the multitude of challenges they have to grapple with. After all, the market is governed by animal spirits! But let us think through this carefully. Competition and collaboration among groups addressing the same market demands result in homogeneous behavior with generally homogeneous outcomes. Simply put, products and services become commoditized; their variance is neither unique nor distinctive. They may be just distinctive enough to eke out profits at the margins before being absorbed into a bigger whole, at which point identity is effaced over time. Organizations gravitate toward a singularity, and unique value propositions wane.

So let us circle back to chance. Chance is our hope for creating divergence. Chance is the element that cancels out the inductive vector of industrial organization. Chance cannot be scheduled; it is not a "waiting for Godot" figure forever around the corner. If it always arrived on cue, it would have been imputed by the determinists in their inductive world and we would end up with a dystopian, homogeneous future. Chance simply happens, and sometimes it has a very short half-life. If the organization and its people are aligned, and their mindset is adapted toward embracing and exploiting that fleeting moment of chance, the consequences can be huge. New models emerge, new divergent paths are traced, and society and markets burst into a garden of colorful ideas in a virtual oasis of new markets.

So now to tie this all to free will and to the unbearable lightness of being! It is the existence of chance that creates the opportunity for an individual to exercise free will, but it is the organization's responsibility to allow the individual to unharness themselves from organizational inertia. Thus, organizations have to perpetuate an environment wherein employees are afforded some headroom to break away. And I do not mean break away as in people leaving the organization to do their own gigs; I mean breaking away in thought and action within the boundaries of the organization, staying open to the element of chance and exploiting it. Great organizations do not just encourage the lightness of being by unharnessing talent; the great organizations are the ones that make the lightness of being unbearable. Those individuals are left with nothing but an awareness of, and openness to, chance to create incredible value … far more awe-inspiring and momentous than the serener state of business-as-usual affairs.

Importance of Heroes and Narratives in Organizations

“My own heroes are the dreamers, those men and women who tried to make the world a better place than when they found it, whether in small ways or great ones. Some succeeded, some failed, most had mixed results… but it is the effort that’s heroic, as I see it. Win or lose, I admire those who fight the good fight.” – George R. R. Martin

 

“Stories, like people and butterflies and songbirds’ eggs and human hearts and dreams, are also fragile things, made up of nothing stronger or more lasting than twenty-six letters and a handful of punctuation marks. Or they are words on the air, composed of sounds and ideas – abstract, invisible, gone once they’ve been spoken – and what could be more frail than that? But some stories, small, simple ones about setting out on adventures or people doing wonders, tales of miracles and monsters, have outlasted all the people who told them, and some of them have outlasted the lands in which they were created.” – Neil Gaiman

Heroes are not born; circumstance and happenstance create heroes. In some cases, heroes are individuals who walk into a minefield of uncertainty that threatens their natural inclination for self-preservation, in the interest of value systems and people alien to them. Thus, a private in an army is already a hero in the fact that he or she is walking into possible harm's way to serve and protect people not necessarily related to him or her. One has heard the adage that one man's freedom fighter is another person's terrorist; someone whom we call a terrorist may be perceived as a hero by someone else. In that case it becomes a matter of point of view, but the fundamental point remains: a hero is a person who abnegates and abjures the right to self-preservation for some greater perceived good.

Sustaining innovation is a vital yet difficult task. Innovation requires the coordinated efforts of many actors to facilitate (1) the recombination of ideas to generate novelty, (2) real-time problem solving, and (3) linkages between present innovation efforts and past experiences and future aspirations. Innovation narratives are cultural mechanisms that address these coordination requirements by enabling translation. Specifically, innovation narratives are powerful mechanisms for translating ideas across the organization so that they are comprehensible and appear legitimate to others. Narratives also enable people to translate emergent situations that are ambiguous or equivocal so as to promote real-time problem solving. As they accumulate, innovation narratives provide a generative memory for organizations that enables people to translate ideas from particular instances of past innovation to inform current and future efforts.

The concept of collective identity has gained prominence within organizational theory as researchers have studied how it consequentially shapes organizational behavior. However, much less attention has been paid to the question of how nascent collective identities become legitimated. Although it is conventionally argued that membership expansion leads to collective identity legitimacy, one can draw on the notion of cultural entrepreneurship to argue that the relationship is more complex and is culturally mediated by the stories told by group members. Legitimacy is more likely to be achieved when members articulate a clear, defining collective identity story that identifies the group’s orienting purpose and core practices. Although membership expansion can undermine legitimation by introducing discrepant actors and practices to a collective identity, this potential downside is mitigated by compelling narratives, which help to coordinate expansion. And that is where heroes can be interwoven into organizational theory and behavior. It is important to create environments that, by happenstance and circumstance, create heroes. The architecture of great organizations weaves heroes and narratives into their tapestry.

Heroes and narratives are instrumental in organizations that forge a pathway to long-term sustenance and growth. Hence, we are quick to idolize figures – Iacocca, Welch, Jobs, Ellison, Gates, Benioff, Gerstner, Branson, Bezos, Zuckerberg, Brin and Page, etc. We learn their narratives through case studies, the news, and scholarly books on successful companies; and we emulate and steal and copy and parody and so much more … not necessarily because we want to be them, but because we want to create our own identity, in our own lair, in ecosystems that move with or against the strongest currents.

So it is essential to celebrate the heroes and the narratives of great companies as an additional instrument to ignite engagement, foray into uncharted territories and conquer the unknown. Personally, I have also found solace in reading biographies of people who have made a difference, and great pleasure in vicariously living through the ebbs and flows of great companies.

 

Medici Effect – Encourage Innovation in the Organization

“Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn’t really do it, they just saw something. It seemed obvious to them after a while. That’s because they were able to connect experiences they’ve had and synthesize new things. And the reason they were able to do that was that they’ve had more experiences or they have thought more about their experiences than other people.”
– Steve Jobs

What is the Medici Effect?

Frans Johansson has written a lovely book on the Medici Effect. The term “Medici” relates to the Medici family in Florence that made immense contributions to art, architecture and literature. They were pivotal in catalyzing the Renaissance, and some of the great artists and scientists we revere today – Donatello, Michelangelo, Leonardo da Vinci, and Galileo – were commissioned or supported in their work by the family.

The Renaissance was a resurgence of classical Greek and Roman learning. It merged the distinctive areas of humanism, philosophy, science, art and literature into a unified body of knowledge that would advance the cause of human civilization. What the Medici effect speaks to is the outcome of creating a system that incorporates what at first glance seem to be distinct and discrete disciplines into holistic outcomes: a shared simmering of wisdom out of which new disciplines, thoughts and implementations emerge.


Supporting the organization to harness the power of the Medici Effect

We are past the Industrial era, the Progressive era and the Information era. There are no sharp lines that truly distinguish one era from another; our knowledge has progressed along gray lines that keep pushing the limits of what we know. We now sit in a crucible wherein distinct disciplines have crisscrossed and merged. The key thesis of the Medici effect is that the intersections of these distinct disciplines enable the birth of breakthrough ideas and leapfrog innovation.

So how do we introduce the Medici Effect in organizations?

Some of the key ways to implement the model are really about providing the supporting infrastructure for:
1. Connections: Our brains are naturally wired toward association. We try to associate a concept with contextual elements around that concept to give it more meaning, and we learn by connecting concepts with elements we are already conversant in. Associations can be made within narrow parameters, constrained by the semantic models we have already created, and organizations often channel connections in exactly that narrow way. On the other hand, connections can be far more free-form, where the connector thinks beyond the immediate boundaries of their domain or beyond the domains that are “pre-ordained”. That is what is commonly known as divergent thinking: we pull elements from seemingly different areas and thread them around some core to generate new approaches, new metaphors, and new models. Ensuring that employees can safely reach out to other nodes of possibility is the primary implementation step in generating the Medici effect.
2. Collaborations: Connecting different streams of thought across different disciplines is the primary, formative step. To advance it further, organizations need to provide systems in which people can collaborate with one another; in fact, collaboration accentuates the final outcome sooner. Enabling connections and collaboration together creates what I would call the network impact on a marketplace of ideas.
3. Learning Organization: Organizations need to continuously add fuel to the ecosystem. In other words, they need to bring in speakers, encourage and invest in training programs, allow for exploration by setting aside an internal budget for that purpose, and give people some time and freedom to mull over ideas. This enriches collaboration within a context of diverse learning.
4. Encourage Cultural Diversity: Finally, organizations have to invest in cultural diversity. People from different cultures hold varied viewpoints and information and view issues from different perspectives. Given how globalized we now are, an innate understanding of, and immersion in, different cultural experiences enhances the Medici effect. It also situates innovation and ground-breaking thought within a broader scope of compassion, humanism, and social and shared responsibility.


Implementing systems that encourage the Medici effect will enable organizations to break out of legacy behavior and venture into unguarded territories. The charter toward unknown but exciting possibilities opens the gateway to ideas that engage employees and let them beat a path to the intersections where new ideas are born.

The Big Data Movement: Importance and Relevance today?

We are entering a new age in which we want a finer understanding of the relationships between businesses and customers, organizations and employees, products and how they are being used, and how different aspects of the business and the organization connect to produce meaningful, actionable information. We are seeing a lot of data, and the old tools for managing, processing and gathering insights from it, such as spreadsheets and SQL databases, do not scale to current needs. Thus, Big Data is becoming a framework for how to process, store and cope with the reams of data being collected.

According to IDC, it is imperative that organizations and IT leaders focus on the ever-increasing volume, variety and velocity of information that forms big data.

  • Volume. Many factors contribute to the increase in data volume – transaction-based data stored through the years, text data constantly streaming in from social media, increasing amounts of sensor data being collected, etc. In the past, excessive data volume created a storage issue. But with today’s decreasing storage costs, other issues emerge, including how to determine relevance amidst the large volumes of data and how to create value from data that is relevant.
  • Variety. Data today comes in all types of formats – from traditional databases to hierarchical data stores created by end users and OLAP systems, to text documents, email, meter-collected data, video, audio, stock ticker data and financial transactions. By some estimates, 80 percent of an organization’s data is not numeric! But it still must be included in analyses and decision making.
  • Velocity. According to Gartner, velocity “means both how fast data is being produced and how fast the data must be processed to meet demand.” RFID tags and smart metering are driving an increasing need to deal with torrents of data in near-real time. Reacting quickly enough to deal with velocity is a challenge to most organizations.

SAS has added two additional dimensions:

  • Variability. In addition to the increasing velocities and varieties of data, data flows can be highly inconsistent, with periodic peaks. Is something big trending in social media? Daily, seasonal and event-triggered peak data loads can be challenging to manage – especially with social media involved.
  • Complexity. When you deal with huge volumes of data, it comes from multiple sources. It is quite an undertaking to link, match, cleanse and transform data across systems. However, it is necessary to connect and correlate relationships, hierarchies and multiple data linkages or your data can quickly spiral out of control. Data governance can help you determine how disparate data relates to common definitions and how to systematically integrate structured and unstructured data assets to produce high-quality information that is useful, appropriate and up-to-date.

 

So to reiterate, Big Data is a framework stemming from the realization that data has gathered significant pace and that its growth has exceeded an organization's capacity to handle, store and analyze it in a manner that offers meaningful insights into the relationships between data points. I call this a framework, unlike other materials that treat Big Data as a consequence of organizations' inability to handle massive amounts of data. I refer to Big Data as a framework because it sets the parameters around an organization's decision as to when, and which, tools must be deployed to address data scalability issues.

Thus, to put the appropriate parameters around when an organization should consider Big Data as part of its analytics roadmap in order to understand the patterns in its data, it has to answer the following ten questions:

  1. What are the different types of data that should be gathered?
  2. What are the mechanisms that have to be deployed to gather the relevant data?
  3. How should the data be processed, transformed and stored?
  4. How do we ensure that there is no single point of failure in data storage and data loss that may compromise data integrity?
  5. What are the models that have to be used to analyze the data?
  6. How are the findings of the data to be distributed to relevant parties?
  7. How do we assure the security of the data that will be distributed?
  8. What mechanisms do we create to implement feedback against the data to preserve data integrity?
  9. How do we morph the big data model into new forms that account for new patterns and reflect what is meaningful and actionable?
  10. How do we create a learning path for the big data model framework?

Some of the existing literature has commingled the Big Data framework with analytics. In fact, some of it goes on to make a rather assertive claim: that Big Data and predictive analytics should be viewed in the same vein. Nothing could be further from the truth!

There are several tools available in the market to do predictive analytics against a set of data that may not qualify for the Big Data framework. While I was the CFO at Atari, we deployed business intelligence tools using MicroStrategy, and MicroStrategy had predictive modules. In my recent past, we explored SAS and Minitab for predictive analytics. In fact, even Excel can do multivariate analysis, ANOVA, regression and best-fit curve analysis. These techniques have been part of the analytics arsenal for a long time, and different data sizes may need different tools to instantiate relevant predictive analysis. This is a very important point, because companies that do not have Big Data ought to seriously reconsider their strategy around which tools and frameworks to use to gather insights. I have known companies that have gone the Big Data route although all the data points (excuse the pun), even after incorporating capacity and forecasts, suggest that alternative tools are more cost-effective than implementing Big Data solutions. Big Data is not a one-size-fits-all model, and it is an expensive implementation. However, for the right data size, which in this case means very large datasets, a Big Data implementation can be extremely beneficial and cost-effective in terms of total cost of ownership.
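To illustrate how far ordinary tools go, here is a hedged sketch using nothing but NumPy on a small, invented dataset of marketing spend versus revenue. None of the figures come from Atari or any real company; the fit is the same least-squares regression that Excel, SAS or Minitab would produce, and it needs no Big Data stack.

```python
# Plain least-squares regression on a small, made-up dataset -- the kind of
# predictive analysis that runs fine well below Big Data scale.
import numpy as np

spend   = np.array([10, 12, 15, 18, 22, 25, 30, 34], dtype=float)         # $K, hypothetical
revenue = np.array([95, 110, 128, 150, 172, 190, 225, 248], dtype=float)  # $K, hypothetical

slope, intercept = np.polyfit(spend, revenue, deg=1)            # best-fit line
predicted = np.polyval([slope, intercept], spend)
r_squared = 1 - np.sum((revenue - predicted) ** 2) / np.sum((revenue - revenue.mean()) ** 2)

print(f"revenue ≈ {slope:.1f} * spend + {intercept:.1f}   (R² = {r_squared:.3f})")
print(f"forecast at $40K spend: ${np.polyval([slope, intercept], 40):.0f}K")
```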

Areas where Big Data Framework can be applied!

Some areas lend themselves particularly well to the application of the Big Data framework. I have identified four broad areas:

  1. Marketing and Sales: Consumer behavior, marketing campaigns, sales pipelines, conversions, marketing funnels and drop-offs, distribution channels are all areas where Big Data can be applied to gather deeper insights.
  2. Human Resources: Employee engagement, employee hiring, employee retention, organization knowledge base, impact of cross-functional training, reviews, compensation plans are elements that Big Data can surface. After all, generally over 60% of company resources are invested in HR.
  3. Production and Operational Environments: Data growth, different types of data appended as the business learns about the consumer, concurrent usage patterns, traffic, web analytics are prime examples.
  4. Financial Planning and Business Operational Analytics: Predictive analytics around bottom-up sales, marketing campaign ROI, customer acquisition costs, earned media and paid media, margins by SKU and distribution channel, operational expenses, portfolio evaluation, risk analysis, etc., are some of the examples in this category.

Hadoop: A Small Note!

Hadoop is becoming a widely accepted tool for addressing Big Data needs. It grew out of work Google published on how it indexed and processed the structured and text information it was collecting in order to present meaningful, actionable results to users quickly (the MapReduce and Google File System papers); the open-source Hadoop project itself was subsequently developed and hardened at Yahoo for enterprise applications.

Hadoop runs on a large number of machines that do not share memory or disks, with the Hadoop software running on each of them. Thus, if you have, for example, over 10 gigabytes of data, you spread that data across the different machines, and Hadoop keeps track of where all of it resides. The individual servers are called nodes, and the group of nodes working together over the distributed data is called a cluster. Each node operates on its own little piece of the data and, once that piece is processed, the results are combined and delivered back to the client as a unified whole. The mechanism that maps work out to the nodes and then reduces their disparate partial results into one combined answer is MapReduce, a core component of Hadoop. You will also hear of Hive, which is essentially a data warehouse layer built on top of Hadoop: it lets you organize and query the data, structured or unstructured, that Hadoop stores redundantly across the cluster and processes through MapReduce.
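Here is a toy sketch of the map/shuffle/reduce idea in plain Python; it is not Hadoop, and the "nodes" are just lists in memory, but it shows how independent partial results computed over separate slices of data get reduced into one unified whole.

```python
# Toy illustration of MapReduce: each "node" maps over its own slice of the data;
# results are shuffled by key and reduced into a single combined answer.
from collections import defaultdict
from functools import reduce

# Pretend each list is the slice of log data held by one node in the cluster.
node_slices = [
    ["error", "ok", "ok", "error"],
    ["ok", "ok", "warn"],
    ["error", "warn", "ok"],
]

def map_phase(records):
    """Each node emits (key, 1) pairs for its own piece of the data."""
    return [(record, 1) for record in records]

def shuffle(mapped_slices):
    """Group intermediate pairs by key across all nodes."""
    groups = defaultdict(list)
    for slice_output in mapped_slices:
        for key, value in slice_output:
            groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Collapse each key's values into a single result."""
    return {key: reduce(lambda a, b: a + b, values) for key, values in groups.items()}

mapped = [map_phase(s) for s in node_slices]     # runs independently per node
print(reduce_phase(shuffle(mapped)))             # {'error': 3, 'ok': 5, 'warn': 2}
```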

Personally, I have always been interested in Business Intelligence. I have always considered BI a stepping stone in the new age: a handy tool for truly understanding a business and developing financial and operational models that track fairly closely the trending insights the data generates. So my ear is always to the ground as I follow the developments in this area … and though I have not implemented a Big Data solution, I have always been, and will continue to be, interested in seeing its applications in certain contexts and against the various use cases in organizations.

 

MECE Framework, Analysis, Synthesis and Organization Architecture toward Problem-Solving

MECE is a thought tool that has been used systematically at McKinsey. It stands for Mutually Exclusive, Collectively Exhaustive. We will go into both components in detail and then relate them to the dynamics of an organizational mindset. The presumption in this note is that the organizational mindset has been ingrained over time or is being driven by the leadership. We look at MECE because it is a tool used by the most blue-chip consulting firm in the world. And while doing that, we will, by the end of the article, arrive at the conclusion that this framework alone is not a panacea for every investigative methodology used to assess a problem; rather, it has to be reconciled with the knowledge that most things do not fall neatly into a MECE structure, and thus an additional systems framework is needed to amplify our understanding, aid problem solving, and leave room for chance.

So to apply the MECE technique, first you define the problem that you are solving for. Once you are past the definition phase, well – you are now ready to apply the MECE framework.

MECE is a framework used to organize information which is:

  1. Mutually exclusive: Information should be grouped into categories so that each category is separate and distinct without any overlap; and
  2. Collectively exhaustive: All of the categories taken together should deal with all possible options without leaving any gaps.

In other words, once you have defined a problem, you figure out the broad categories that relate to it and then brainstorm through ALL of the options associated with those categories. Think of it as a mental construct: you move across a horizontal line of well-defined, non-overlapping shades representing the categories, and each of those shaded partitions has a vertical construct holding all of the options that exhaustively explain it. Once you have gone through that exercise, which is no mean feat, you are looking at an artifact that addresses the problem. You then look at every set of options and its relationship to its distinct category … and hopefully you are well on your way to relevant solutions.
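A small sketch may help make "MECE think" tangible: below, a hypothetical churn problem is broken into categories, and two checks verify that the grouping is mutually exclusive (no account falls into two buckets) and collectively exhaustive (no account is left out). The account IDs and category names are invented.

```python
# MECE check on a toy breakdown of the question "why did monthly churn rise?"
universe = {  # every churned account, each tagged with the driver we attributed it to
    "acct-01", "acct-02", "acct-03", "acct-04", "acct-05", "acct-06",
}

categories = {
    "price-driven":          {"acct-01", "acct-04"},
    "product gaps":          {"acct-02", "acct-05"},
    "service issues":        {"acct-03"},
    "went out of business":  {"acct-06"},
}

def is_mece(universe, categories):
    buckets = list(categories.values())
    # Mutually exclusive: no account falls into two categories.
    no_overlap = all(a.isdisjoint(b)
                     for i, a in enumerate(buckets)
                     for b in buckets[i + 1:])
    # Collectively exhaustive: every account is covered by some category.
    no_gaps = set().union(*buckets) == universe
    return no_overlap, no_gaps

print(is_mece(universe, categories))   # (True, True) for this toy breakdown
```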

Now some may argue that my rendering of MECE is very simplistic, and it may very well be. But I can assure you that it captures the essence of a framework widely used in consulting organizations, one that has been imported into large organizations and has cascaded down to organizations of every scale since.

Here is a link that would give you a deeper understanding of the MECE framework:

http://firmsconsulting.com/2010/09/22/a-complete-mckinsey-style-mece-decision-tree/

Now we are going to dig a little deeper. Allow me to digress and take you down a path less travelled; we will circle back to MECE and organizational leadership in a few moments. One of the memorable quotes that has left a lasting impression on me is from the great Nobel Prize-winning physicist Richard Feynman.

“I have a friend who’s an artist and has sometimes taken a view which I don’t agree with very well. He’ll hold up a flower and say “look how beautiful it is,” and I’ll agree. Then he says “I as an artist can see how beautiful this is but you as a scientist takes this all apart and it becomes a dull thing,” and I think that he’s kind of nutty. First of all, the beauty that he sees is available to other people and to me too, I believe. Although I may not be quite as refined aesthetically as he is … I can appreciate the beauty of a flower. At the same time, I see much more about the flower than he sees. I could imagine the cells in there, the complicated actions inside, which also have a beauty. I mean it’s not just beauty at this dimension, at one centimeter; there’s also beauty at smaller dimensions, the inner structure, also the processes. The fact that the colors in the flower evolved in order to attract insects to pollinate it is interesting; it means that insects can see the color. It adds a question: does this aesthetic sense also exist in the lower forms? Why is it aesthetic? All kinds of interesting questions which the science knowledge only adds to the excitement, the mystery and the awe of a flower! It only adds. I don’t understand how it subtracts.”

The above quote by Feynman lays the groundwork for understanding two different approaches: the artist approaches the observation of the flower from the synthetic standpoint, whereas Feynman approaches it from an analytic standpoint. The two views are not antithetical to one another; in fact, you need both to gather a holistic view and arrive at a conclusion, where the sum is greater than the parts. Feynman does not address the essence of beauty that the artist puts forth; he looks at the beauty of how the components and their mechanics interact and how that adds to our understanding of the flower. This is important because the following passage explores another concept to drive the difference between analysis and synthesis home.

There are two possible ways of gaining knowledge. Either we can proceed from the construction of the flower, and then seek to determine the laws of the mutual interaction of its parts as well as its response to external stimuli; or we can begin with what the flower accomplishes and then attempt to account for this. By the first route we infer effects from given causes, whereas by the second we seek the causes of given effects. We can call the first route synthetic, and the second analytic.

 

We can easily see how the cause-and-effect relationship translates into a relationship between the analytic and synthetic foundations.

 

A system’s internal processes — i.e. the interactions between its parts — are regarded as the cause of what the system, as a unit, performs. What the system performs is thus the effect. From these very relationships we can immediately recognize the requirements for the application of the analytic and synthetic methods.

 

The synthetic approach — i.e. to infer effects on the basis of given causes — is therefore appropriate when the laws and principles governing a system’s internal processes are known, but when we lack a detailed picture of how the system behaves as a whole.

Consider an example: we do not have a very good understanding of the long-term dynamics of galactic systems, nor even of our own solar system. This is because we cannot observe these objects for the thousands or even millions of years which would be needed in order to map their overall behavior.

 

However, we do know something about the principles which govern these dynamics, i.e. the gravitational interaction between the stars and planets respectively. We can therefore apply a synthetic procedure in order to simulate the gross dynamics of these objects. In practice, this is done with computer models which calculate the interaction of the system's parts over long, simulated time periods.
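As a toy version of such a computer model, here is a minimal Python sketch of the synthetic route: the governing principle (Newtonian gravity) is assumed known, and the gross behavior of a simple Sun-Earth system is simulated forward in time with a deliberately crude Euler integrator rather than observed.

```python
# Synthetic method in miniature: known local law (inverse-square gravity),
# simulated global behavior (an approximate one-year orbit).
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # kg
AU = 1.496e11            # m

# Earth starts at 1 AU with roughly its mean orbital speed.
x, y = AU, 0.0
vx, vy = 0.0, 29_780.0   # m/s
dt = 3600.0              # one-hour time steps

for step in range(int(365.25 * 24)):     # integrate one year
    r = math.hypot(x, y)
    ax = -G * M_SUN * x / r**3           # acceleration from the inverse-square law
    ay = -G * M_SUN * y / r**3
    vx += ax * dt
    vy += ay * dt
    x += vx * dt
    y += vy * dt

print(f"After one simulated year, Earth is {math.hypot(x, y)/AU:.3f} AU from the Sun")
```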

The analytical approach, drawing conclusions about causes on the basis of effects, is appropriate when a system's overall behavior is known but we do not have clear or certain knowledge of the system's internal processes or the principles governing them. There are also a great many systems for which we neither have a clear and certain conception of how they behave as a whole, nor fully understand the principles at work which cause that behavior. Organizational behavior is one such example, since it introduces the fickle spirits of employees that, in the aggregate, create a distinct character in the organization.

Leibniz was among the first to define analysis and synthesis as modern methodological concepts:

“Synthesis … is the process in which we begin from principles and [proceed to] build up theorems and problems … while analysis is the process in which we begin with a given conclusion or proposed problem and seek the principles by which we may demonstrate the conclusion or solve the problem.”

 

So we have wandered down this path of analysis and synthesis, and now we circle back to MECE and the organization. The MECE framework is a prime example of the application of the analytic method within an organizational structure. The underlying hypothesis is that applying the framework will illuminate and add clarity to the problems we are solving for. But here is the problem: the approach can lead to paralysis by analysis. Applying the framework alone, one can lose oneself in the weeds when it is just as important to view the forest. So organizations have to step back and assess at what point to stop the analysis, i.e. decide that enough information has been gathered, and at what point to set out toward a set of principles that will govern the action to solve the problems. It is almost always impossible to gather all the information needed to make the best decision, especially when speed, iteration, quickly distinguishing oneself from the herd, and stamping a clear brand are becoming the hallmarks of great organizations.

Applying the synthetic principle in addition to “MECE think” leaves room for error and sub-optimal solutions. But it crowdsources the limitless power of imagination and pattern thinking that allows the organization to make critical breakthroughs in innovative thinking. It is thus important that both principles be promulgated by the leadership as coexisting principles that drive the organization forward. Doing so ignites employee engagement, and it accommodates the stochastic errors that result when employees do not have every MECE condition checked off.

 

In conclusion, it is important that the organization and its leadership set their architecture upon the traditional pillars of analysis and synthesis: MECE and systems thinking. That architecture serves as the springboard for employees, allowing for the accidental discoveries, flights of imagination, and Nietzschean leaps that move the organization along the pathway of innovation while it remains grounded upon the bedrock of facts and empirical observation.