Category Archives: Learning Process

Emergent Systems: Introduction

The whole is greater than the sum of its parts. “Emergent properties” are properties that arise from the collaborative functioning of a system, as discussed in CAS, and they can be entirely unexpected. In other words, emergent properties are properties of a group of items; it would be erroneous to reduce such systems to the properties of their atomic elements and use those properties alone to understand emergence. Common examples of emergent properties include cities, bee hives, ant colonies and market systems. Our thinking attributes causal effects – namely, that the behavior of elements causes certain behaviors at other levels of the hierarchy, and thus an entity emerges in a certain state. However, what we observe in a process of emergence is an effect without an apparent cause. It is nonetheless important to step back, regard the relationships, and draw lines of attribution, so that one can conclude that the elements at the lowest level have an impact that surfaces, in some manner, at the highest level – the level that is the subject of our observation.

Jochen Fromm, in his paper “Types and Forms of Emergence,” lays this out best. He says that emergent properties are “amazing and paradox: fundamental but familiar.” In other words, emergent properties are changeless and changing, constant and fluctuating, persistent and shifting, inevitable and unpredictable. His most important observation is that an emergent property is part of the system and, at the same time, might not always be part of the system. There is an undercurrent of novelty, or punctuated gaps that arise inexplicably, and it is this fact that renders true emergence virtually irreducible. Thus, failure is embodied in all emergent systems – failure being that the system does not behave according to expectation. Even when all rules are followed and quality thresholds are established at every toll gate, there remains a possibility of failure at the highest level, which suggests that some information in the links is missing. It is also possible that the missing information is dynamic – you do not step in the same water twice – which makes predicting emergent systems a rather difficult exercise. Depending on the lens through which we look, the system might appear or disappear.

There are two types of emergence: descriptive and explanatory. Descriptive emergence means that the properties of wholes cannot necessarily be defined through the properties of the parts. Explanatory emergence means that the laws of complex systems cannot be deduced from the laws of interaction of the simpler elements that constitute them. Emergence is thus a result of the amount of variety embodied in the system, the amount of external influence that weights and shapes the overall property and direction of the system, the type of resources the system consumes, the constraints the system operates under, and the number of levels of sub-systems that work together to build out the final system. A benign system is one that is relatively predictable, whereas a radical system departs materially from expectation. If the parts that constitute a system work independently of the other parts and can be boxed within boundaries, the emergent system becomes more predictable. A watch – its various mechanical elements geared toward reading the time as the ultimate purpose – is a good example of a complex physical system. However, such systems are very brittle: a failure at one point can cascade into a failure of the entire system. Systems that are more resilient are those whose elements interact and learn from one another. In other words, the behavior of the elements excites other elements – all of which work together in a dance toward a more stable state. They deploy what are often called the flocking trick and the pheromone trick. The flocking trick is largely the emulation of particles that are close to each other – very similar to the cellular automata introduced by von Neumann and discussed in the earlier chapter. The pheromone trick reflects how elements leave marks that are read as signals by other elements; the elements all work around these signal trails, which act as a forcing function in creating the system.
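The pheromone trick can be put in code. The sketch below is a toy model of my own construction – trail names, lengths, and rates are all hypothetical: ants greedily follow the strongest trail, trails evaporate, and a deposit is added in inverse proportion to trail length, so the colony locks onto a trail without any central direction.

```python
def pheromone_trails(iterations=50, evaporation=0.1, deposit=1.0):
    """Toy pheromone model: each ant follows the strongest trail,
    all trails evaporate a little, and the trail used is reinforced
    in inverse proportion to its length."""
    lengths = {"short": 2.0, "long": 3.0}    # hypothetical trail lengths
    pheromone = {"short": 0.0, "long": 0.0}  # scent laid down so far
    for _ in range(iterations):
        # greedy choice: follow the strongest trail (first key on ties)
        choice = max(pheromone, key=pheromone.get)
        for trail in pheromone:
            pheromone[trail] *= 1.0 - evaporation  # scent fades over time
        pheromone[choice] += deposit / lengths[choice]  # reinforce the trail used
    return pheromone
```

Once a trail pulls ahead it keeps being chosen and reinforced – the signal trail acts as the forcing function described above.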

There are systems that have properties of extremely strong emergence. What do consciousness, life, and culture have in common? How do we look at climate? What about the organic development of cities? These are just some examples of systems where determinism is nigh impossible. We might be able to tunnel through the various and diverse elements that embody the system, but it would be difficult to coherently and tangibly draw the full set of relationships, signals, effectors, detectors, etc. required for a complete understanding of the system. Wrestling with a strongly emergent system might be a task outside the purview of even the highest level of computational power available. And yet these systems exist, and they emerge and evolve. Still we try to plan for these systems, or direct policies to influence them, without fully knowing the impact. This is also where the unintended consequences of our actions take free rein.

Complex Physical and Adaptive Systems

There are two models in complexity: Complex Physical Systems and Complex Adaptive Systems! To grasp the patterns that are evolving – much of it seemingly out of our control – it is important to understand both models. One could argue that these models are mutually exclusive. While the existing body of literature might be inclined to support that argument, we also find some degree of overlap that makes our understanding of complexity unstable. And instability is not to be construed as a bad thing! We might operate in a deterministic framework, but more often we operate with only a gradient understanding of the volatility associated with outcomes. Keeping this in mind will be helpful as we dive deep into the two models. Our hope is that understanding these models will raise questions and establish mental frameworks for the intentional choices that we are led to make by the system, or that we make to influence the evolution of the system.

 

Complex Physical Systems (CPS)

Complex Physical Systems are bounded by certain laws. Given initial conditions and elements in the system, there is a degree of predictability and determinism associated with the behavior of the elements under the overarching laws of the system. Despite the term (Complex Physical System) suggesting a physical boundary, the late 1900s surfaced some nuances in this model: if there is a slight, arbitrary variation in the initial conditions, the outcome can differ significantly from expectations. The assumption of determinism is put to the sword. The notion that behaviors will follow established trajectories once rules are established and laws are defined has been put to the test. These discoveries offer an insight into the building blocks of complex physical systems; a better understanding of them enables us to recognize such systems when we see them, and thereafter to establish certain toll gates and actions that help us navigate, to the extent possible, toward a narrower region of uncertainty around outcomes.

The universe is designed as a complex physical system. Just imagine! Let that sink in a bit. A complex physical system might be regarded as relatively simpler than a complex adaptive system. And with that in mind, once again … the universe is a complex physical system. We are awed by the vastness and scale of the universe; we regard the skies with reverence and wonder what lies beyond the frontiers of the universe, if anything. There is nothing bigger than the universe in the physical realm, and yet we regard it as a simple system – a “simple” complex physical system. In fact, the behavior of ants that sustains an ant colony is significantly more complex: by orders of magnitude.

Complex behavior in nature reflects the tendency of large systems with many components to evolve into a poised “critical” state, where minor disturbances or arbitrary changes in initial conditions can have a seemingly catastrophic impact such that the system changes significantly. And that happens not by some invisible hand or some uber design. Fundamental to understanding complex systems is that complexity is defined by the variability of the system. Depending on our lens, the scale of variability changes, and a different apparatus might be required to understand the system. Thus, determinism is not the measure: Stephen Jay Gould has argued that it is virtually impossible to predict the future. We have hindsight explanatory powers but not predictive powers. Hence, a system that starts from an initial state might, over time, present an outcome distinguishable in form and content from the original state. We see complex physical systems all around us: snowflakes, patterns on coastlines, waves crashing on a beach, rain, etc.
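The poised “critical” state has a classic illustration in the Bak–Tang–Wiesenfeld sandpile model – a model I am bringing in for illustration, not one named in this text. Grains pile up on a grid; any cell reaching four grains topples, shedding one grain to each neighbor, and near the critical state one extra grain can trigger an avalanche of almost any size. A minimal sketch:

```python
def sandpile(grid, x, y):
    """Drop one grain at (x, y) on a square grid and topple any cell
    holding four or more grains, shedding one grain to each neighbour.
    Returns the number of topples (the avalanche size)."""
    n = len(grid)
    grid[x][y] += 1
    unstable = [(x, y)] if grid[x][y] >= 4 else []
    topples = 0
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:
            continue  # already relaxed by an earlier topple
        grid[i][j] -= 4
        topples += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < n:  # edge grains fall off the table
                grid[ni][nj] += 1
                if grid[ni][nj] >= 4:
                    unstable.append((ni, nj))
    return topples
```

Dropping a single grain on a 3×3 grid already poised at three grains per cell topples every cell at least once – a minor disturbance with a system-wide effect, and no designer in sight.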

Complex Adaptive Systems (CAS)

Complex adaptive systems, on the contrary, are learning systems that evolve. They are composed of elements, called agents, that interact with one another and adapt in response to those interactions.

Markets are a good example of complex adaptive systems at work.

CAS agents have three levels of activity. As described by John Holland in Complexity: A Very Short Introduction, the three levels of activity are:

  1. Performance (moment-by-moment capabilities): This establishes the locus of all behavioral elements that signify the agent at a given point in time, and thereafter establishes triggers or responses. For example, if an object is approaching and the response of the agent is to run, that would constitute a performance if-then outcome. Alternatively, it could be signal driven – namely, an ant emits a certain scent when it finds food; other ants catch that trail and act, en masse, to follow it. Thus, an agent or actor in an adaptive system has detectors, which allow it to capture signals from the environment for internal processing, and effectors, which translate that processing into higher-order signals that influence other agents to behave in certain ways in the environment. The signal is the scent that creates these interactions, and thus the rubric of a complex adaptive system.
  2. Credit assignment (rating the usefulness of available capabilities): As the agent gathers experience over time, it starts to rely heavily on certain rules or heuristics that it has found useful. These may not be the best rules; they may simply be the rules of first discovery, and thus they stay. Agents rank these rules in some sequential, perhaps ordinal, order to determine the best rule to fall back on in a given situation. This is the crux of decision making. However, there are times when it is difficult to assign a rank to a rule, especially when an action is setting the stage for a future course of other actions. A spider weaving a web is an example of an agent expending energy in the hope that she will get some food – a stage-setting assignment that agents have to undergo as well. One of the models commonly used to describe this is the bucket-brigade algorithm, which essentially states that the strength of a rule depends on the success of the overall system and the agents that constitute it. Each rule needs to be aware only of the strengths of its immediate predecessor and successor, via a number assignment that grows stronger from the origin of the chain to its end. If there is a valuable final end-product, then the pathway of rules reflects success. Once again, it is conceivable that this is not the optimal pathway but a satisficing pathway that results in a better system.
  3. Rule discovery (generating new capabilities): Performance and credit assignment suggest that agents are governed by a certain bias. If agents have been successful following certain rules, they will be inclined to follow those rules all the time. As noted, rules might not be optimal but satisficing. Is improvement, then, just a matter of incremental changes to the process? We do see major leaps in improvement – so how and why do they happen? In other words, someone in the process has decided to take a different rule despite their experience. It could have been an accident, or very intentional.
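The credit-assignment step can be sketched in code. The following is a minimal sketch of the bucket-brigade idea under simplifying assumptions of my own: a fixed chain of rules that fires in order every episode, with a fixed bid fraction. Each rule pays part of its strength to its predecessor when it fires, and the environment rewards only the final rule; over repeated episodes, credit flows backward until every rule in the successful chain carries high strength.

```python
def bucket_brigade(n_rules=4, reward=10.0, bid_fraction=0.1, episodes=200):
    """Sketch of bucket-brigade credit assignment: each rule pays a bid
    to the rule that set it up; the environment pays the final rule."""
    strengths = [1.0] * n_rules
    for _ in range(episodes):
        bids = [bid_fraction * s for s in strengths]
        for i in range(n_rules):
            strengths[i] -= bids[i]          # each rule pays its bid...
            if i > 0:
                strengths[i - 1] += bids[i]  # ...to its predecessor in the chain
        strengths[-1] += reward              # the end of the chain is rewarded
    return strengths
```

Credit seeps backward along the chain: each rule's strength climbs toward reward / bid_fraction (here, 100), even though only the last rule is ever paid directly.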

One of the theories that has been presented is that of building blocks. CAS innovation is a result of reconfiguring the various components in new ways. One might quip that if energy is neither created nor destroyed, then everything that exists today or will exist tomorrow is nothing but a reconfiguration of energy in new ways. All of tomorrow resides in today … just patiently waiting to be discovered. Agents create hypotheses and experiment in the petri dish, reconfiguring their experiences and other agents’ experiences to formulate hypotheses and a runway for discovery. In other words, there is a collaboration element at play: the interaction of the various agents, and their assignment as a group to a rule, sets the stepping stone for potential leaps in innovation.

Another key characteristic of CAS is that the elements are constituted in a hierarchical order. Combinations of agents at a lower level result in a set of agents higher up, and so on. Thus, agents in higher hierarchical orders take on some of the properties of the lower orders, but they also include the interaction rules that distinguish the higher order from the lower.

Short History of Complexity

Complexity theory began in the 1930s, when natural scientists and mathematicians rallied together to gain a deeper understanding of how systems emerge and play out over time. However, the groundwork was laid in the 1850s with Darwin’s introduction of natural selection, and extended by Mendel’s laws of genetic inheritance. Darwin posited evolution as a slow, gradual process: “Natural selection acts only by taking advantage of slight successive variations; she can never take a great and sudden leap, but must advance by short and sure, though slow steps.” Thus, he concluded that complex systems evolve by small steps, and the result is an organic formulation of an irreducibly complex system – one composed of many parts, all of which work closely together for the overall system to function. If any part is missing or does not act as expected, the system becomes unwieldy and breaks down. This was an early foray into distinguishing the emergent property of a system from the elements that constitute it. Mendel, on the other hand, laid out the property of inheritance across generations: an organic system inherits certain traits that are reconfigured over time and adapted to the environment, leading to the development of an organism which, for our purposes, falls in the realm of a complex outcome. One would imagine a common thread between Darwin’s natural selection and Mendel’s laws of genetic inheritance. But that is not the case, and that has wide implications for complexity theory. Mendel focused on how traits are carried across time – mechanics largely determined by probabilistic functions. His underlying theory hinted at the possibility that a complex system is a result of discrete traits that are passed on, while Darwin suggested that complexity arises from continuous random variations.

 

In the first half of the twentieth century, the literature came to suggest that a complex system has elements of both: continuous adaptation and discrete inheritance that is hierarchical in nature. A group of biologists reconciled the two theories into what is commonly known as the Modern Synthesis. Its guiding principles were that natural selection is the major mechanism of evolutionary change, and that small random variations of genes combined with natural selection result in the origin of new species. Furthermore, the new species might have properties different from the elements that constitute it. The Modern Synthesis thus provided a framework for complexity theory. What does this great debate mean for our purposes? Once we determine that a system is complex, does the debate shed more light on how we regard complexity and how we subsequently deal with it? We need to extend our thinking further by looking at a few developments in the 20th century that give us a better perspective. Let us then continue our journey into the evolution of thinking around complexity.

 

Axioms are statements that are self-evident. An axiom serves as a premise or starting point for further reasoning and argument. An axiom is thus not contestable, because if it were, all the reasoning built upon it would fall apart. For our purposes and our understanding of complexity theory: a complex system has an initial state that is irreducible, physically or mathematically.

 

One of the key elements in complexity is computation, or computability. In the 1930s, Turing introduced the abstract concept of the Turing machine. There is a lot of literature on the specifics of how the machine works, but that is beyond the scope of this book. However, key elements can be gleaned from the concept to better understand complex systems. A complex system that evolves is the result of a finite number of steps that solve a specific challenge. Although the concept has been applied within the boundaries of computational science, I am taking the liberty of applying it to emerging complex systems. Complexity classes help scientists categorize problems based on how much time and space are required to solve them and verify their solutions. Complexity is thus a function of time and memory. This is a very important concept, and we have radically simplified it here for a self-serving purpose: to understand complexity and how to approach the grand challenges. Time complexity refers to the number of steps required to solve a problem. A complex system might not be the most efficient outcome, but it is nonetheless the outcome of a series of steps, backward and forward, that result in a final state. There are pathways, or efficient algorithms, that are produced, and the mechanical states that produce them are defined and known. Space complexity refers to how much memory an algorithm depends on to solve the problem. Let us keep these concepts in mind as we round this up into the more comprehensive picture we will relay at the end of this chapter.
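Time complexity – the number of steps – can be made concrete by counting. The sketch below (the function names are mine) runs a linear scan and a binary search over the same sorted list and reports the steps each one took; the step count, not the answer, is what a complexity class describes.

```python
def linear_search_steps(items, target):
    """Scan left to right; worst case takes as many steps as there are items."""
    steps = 0
    for i, x in enumerate(items):
        steps += 1
        if x == target:
            return i, steps
    return -1, steps

def binary_search_steps(items, target):
    """Halve the sorted range each step; steps grow with log2 of the size."""
    lo, hi, steps = 0, len(items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid, steps
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps
```

On a sorted list of 1,024 items, the linear scan may need 1,024 steps while binary search needs at most 11; space complexity would instead ask how much memory each holds along the way (here, both use only a few variables).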

Around the 1940s, John von Neumann introduced the concept of self-replicating machines. Like Turing, von Neumann designed an abstract machine which, when run, would replicate itself. The machine consists of three parts: a ‘blueprint’ for itself; a mechanism that can read any blueprint and construct the machine (sans blueprint) specified by that blueprint; and a ‘copy machine’ that can make copies of any blueprint. After the mechanism has been used to construct the machine specified by the blueprint, the copy machine creates a copy of that blueprint, and this copy is placed into the new machine, resulting in a working replication of the original. Some variants work in the reverse order, copying the blueprint first and then building the machine. The implications are significant. Can complex systems regenerate? Can they copy themselves and exhibit the same behavior and attributes? Are emergent properties equivalent? Does history repeat itself, or does it rhyme? How does this thinking move our understanding and operating template forward once we identify complex systems?

Let us step forward to the late 1960s, when John Conway started experiments extending the concept of cellular automata. As a result, he introduced the Game of Life in 1970. His main thesis was simple: the game is a zero-player game, meaning that its evolution is determined by its initial state, requiring no further input. One interacts with the Game of Life by creating an initial configuration and observing how it evolves or, for advanced players, by creating patterns with particular properties. The entire formulation plays out on a two-dimensional grid in which patterns evolve over time. It is one of the finest examples in science of how a few simple, non-arbitrary rules can result in incredibly complex behavior that is fluid and yields pleasing patterns over time. In other words, an outsider looking in would see a pattern emerging from simple initial states and simple rules. We encourage you to look at the many patterns that people have constructed from different Game of Life starting configurations. The main elements are as follows: a square grid contains cells that are either alive or dead, and the behavior of each cell depends on the state of its eight immediate neighbors – the eight adjacent cells on a square grid. The cells strictly follow the rules below.

Live Cells:

  1. A live cell with zero or one live neighbors will die.
  2. A live cell with two or three live neighbors will remain alive.
  3. A live cell with four or more live neighbors will die.

Dead Cells:

  1. A dead cell with exactly three live neighbors becomes alive.
  2. In all other cases a dead cell will stay dead.

Thus, his simulation led to the determination that life is an example of emergence and self-organization: complex patterns can emerge from the implementation of very simple rules. The Game of Life thus encourages the notion that “design” and “organization” can spontaneously emerge in the absence of a designer.
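The rules above fit in a few lines of code. Here is a minimal sketch that represents the grid as a set of live-cell coordinates (an unbounded grid, a simplification of a fixed board):

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Game of Life.
    `live` is the set of (x, y) coordinates of live cells."""
    # Count live neighbours for every cell adjacent to a live one.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Survive on two or three live neighbours; birth on exactly three.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}
```

A three-cell row (the “blinker”) flips to a column and back every generation – the simplest oscillating pattern, emerging from nothing but the rules.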

Stephen Wolfram introduced the concept of Class 4 cellular automata, of which Rule 110 is well known and widely studied. Class 4 automata validate much of the thinking grounding complexity theory. Wolfram showed that certain patterns emerge from initial conditions that are neither completely random nor regular: they hint at an order, and yet the order is not predictable. One might expect that applying a simple rule repetitively to the simplest possible starting point would produce a system that is orderly and predictable, but that is far from the truth. The results exhibit some randomness, yet produce patterns with order and some intelligence.
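Rule 110 itself is tiny: a cell's next state is simply a binary digit of the number 110, indexed by the three-cell neighborhood above it. A sketch (the fixed dead boundary is my simplification; Wolfram's analysis assumes an unbounded row):

```python
def rule110_step(cells):
    """One step of elementary cellular automaton Rule 110.
    `cells` is a tuple of 0/1 values; cells beyond the ends count as 0."""
    rule = 110  # binary 01101110: next state for each 3-cell neighbourhood
    padded = (0,) + tuple(cells) + (0,)
    return tuple(
        (rule >> (padded[i - 1] * 4 + padded[i] * 2 + padded[i + 1])) & 1
        for i in range(1, len(padded) - 1))
```

Started from a single live cell, the pattern grows leftward into the mix of order and apparent randomness described above.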

Thus, his main conclusion is that complexity does not have to beget complexity: simple forms following repetitive and deterministic rules can result in systems that exhibit unexpected and unpredictable complexity. However, he sidesteps the discussion around the level of complexity that his Class 4 automata generate. Does this determine, or shed light on, evolution – how human beings are formed, how cities evolve organically, how climate is impacted, how the universe undergoes change? One would argue that it does not. However, if you take into account Darwin’s natural selection, Mendel’s laws of genetic inheritance and the corresponding propagation of traits, the definitive steps prescribed by the Turing machine that capture time and memory, von Neumann’s machines able to replicate themselves without any guidance, and Conway’s tour de force in proving that initial conditions without any further input can create intelligent systems – you can start connecting the dots to arrive at a core conclusion: higher-order systems can organically create themselves from initial starting conditions. They exhibit a collective intelligence which is outside the boundaries of precise prediction. In the previous chapter we discussed complexity and introduced an element of subjective assessment into how we regard what is complex and to what degree. Whether complexity falls in the realm of a first-person subjective phenomenon or a third-party objective scientific phenomenon has yet to be ascertained. Yet it is indisputable that the product of a complex system might be considered a live pattern of rules acting upon agents to cause some deterministic yet seemingly random variation.

Building a Lean Financial Infrastructure!

A lean financial infrastructure presumes the ability of every element in the value chain to preserve and generate cash flow. That is the fundamental essence of the lean infrastructure I espouse. So what are the key elements that constitute a lean financial infrastructure?

And given the elements, what are the key tweaks that one must continually make to ensure that the infrastructure does not fall into entropy, and that the gains that are made do not fall flat or decay over time? Identifying the blocks, monitoring them, and making rapid changes go hand in hand.

The Key Elements or the building blocks of a lean finance organization are as follows:

  1. Chart of Accounts: This is the critical unit that defines the starting point of the organization. It relays and groups all of the key economic activities of the organization into larger bodies of elements such as revenue, expenses, assets, liabilities and equity. Granularity in these activities might lead to a fairly extensive chart of accounts and require more work to manage and monitor, thus requiring an incrementally larger investment of time and effort. However, the benefits of granularity far exceed the costs, because it forces management to look at every element of the business.
  2. The Operational Budget: Every year, organizations formulate the operational budget. That is generally a bottom-up rollup at a granular level that maps to the Chart of Accounts. It might follow a top-down directive around where the organization wants to land with respect to income, expense, balance sheet ratios, et al. Hence, there is almost always a process of iteration in this step to finally arrive at and lock down the budget. Be mindful, though, that there are feeders into the budget that might relate to customers, sales, operational metrics targets, etc., all of which are part of building a robust operational budget.
  3. The Deep Dive into Variances: As you progress through the year, as part of the monthly closing process one inquires how actual performance is tracking against the budget. Since the budget has been done at a granular level and mapped exactly to the Chart of Accounts, it becomes easier to delve into the variances. Be mindful that every element of the Chart of Accounts must be evaluated. The general inclination is to focus on the large items or large variances while skipping the small expenses and small variances. That method, while efficient, might not be effective in the long run for building a lean finance organization. The rule, in my opinion, is that every account has to be looked at, and the question should be: why? If management agreed on a number in the budget, why are the actuals trending differently? Could the budget itself have missed something critical? Or has there been a change in the underlying economics of the business, or a change in activities, that is leading to these “unexpected variances”? One has to take a scalpel to both favorable and unfavorable variances, since one can learn a lot about the underlying drivers from each; it might lead, managerially, to doing more of the better and less of the worse. Furthermore, this is also a great way to monitor leaks in the organization. Leaks are instances of cash dropping out of the system. Many little leaks can amount to a lot of cash in total, so do not disregard them. Not only will plugging them preserve cash, but once you understand the leaks better, the organization will step up in efficiency and effectiveness with respect to cash preservation and delivery of value.
  4. Tweak the Process: You will find that as you deep dive into the variances, you might want to tweak certain processes so those variances are minimized. This is generally true for adverse variances against the budget. Seek to understand why the variance occurred, and then understand all of the background processes that generate activity in the account. Once you fully understand the process, it is a matter of tweaking it, marginally or structurally, in key areas that might favorably resonate across the financials in the future.
  5. The Technology Play: Finally, evaluate the possibilities of exploring technology to surface issues early, automate repetitive processes, trigger alerts that mitigate issues before they grow, and provide on-demand analytics. Use technology to free up time, and to assist and enable more thinking around how to improve the internal handoffs that further economic value in the organization.
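The variance discipline in step 3 can be sketched in a few lines: compute a variance for every account in the chart, not just the large ones. The account names and amounts below are hypothetical.

```python
def variance_report(budget, actual):
    """Absolute and percentage variance for EVERY budgeted account,
    favorable and unfavorable alike."""
    report = {}
    for account, planned in budget.items():
        variance = actual.get(account, 0.0) - planned
        pct = variance / planned if planned else float("inf")
        report[account] = (variance, pct)
    return report

# Hypothetical monthly numbers mapped to the chart of accounts
budget = {"revenue": 100_000.0, "travel": 10_000.0, "software": 5_000.0}
actual = {"revenue": 110_000.0, "travel": 12_000.0, "software": 4_500.0}
report = variance_report(budget, actual)
```

Note that even the small, favorable software variance appears in the report and earns its “why?” – the scalpel applies to both directions.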

All of the above relate to managing the finance and accounting organization well within its own domain. However, there is a bigger step once the blocks are established, and that is linking corporate strategy to the continual evolution of the financial infrastructure.

The essential question the lean finance organization has to answer is: what can the organization do to address every element that preserves and enhances value to the customer, and how do we eliminate all non-value-added activities? This is largely a process question, but it forces one to understand the key processes and identify what percentage of each process is value-added to the customer versus non-value-added. This can be represented along a time or cost dimension. The goal is to yield as many value-added activities as possible, since the underlying presumption is that such activity will lead to the preservation of cash and also increase cash acquisition from the customer.

Disseminating financial knowledge to develop engaged organizations

Financial awareness of key drivers is becoming a paramount leading indicator of organizational success. For most, the finance department is a corner-office service that offers ad hoc analysis of strategic and operational initiatives and provides an ex-post assessment of the financial condition of the company to a select few. There are some key financial metrics that one wants to measure across all companies and all industries without exception, but there are also unique metrics that reflect the key underlying drivers of organizational success. Organizations align their forays into new markets, new strategies and new ventures around a narrative that culminates in a financial metric, or a proxy, that illustrates opportunities lost or gained.

Having been cast in operational finance roles for a good length of my career, I have often encountered a high level of interest in learning financial concepts in areas such as engineering, product management, operations, sales, etc. I have to admit that I have been humbled by the fairly wide common-sense understanding of basic financial concepts that these folks have. In most cases, however, the understanding is less than skin deep, with meaningful misunderstandings. The good news is that I have also noticed a promising trend: questions are more thoroughly weighed by the “non-finance” participants, and there seems to be an elevated understanding of the key financial drivers that translate into commercial success. This knowledge continues to accelerate, largely because of the convergence of areas around data science, analytics, assessment of personal ownership stakes, etc. But the passing of such information across these channels to hungry recipients is not formalized. In other words, I posit that having a formal channel for inculcating financial education across the various functional areas would pay rich dividends for the company in the long run. Finance is a vast enough field that imparting general knowledge of these concepts, at more than skin depth, would also enable the finance group to engage in meaningful conversations with other functional experts, allowing the narrative around the numbers to be more wholesome. Thus, imparting financial knowledge would be beneficial to the finance department as well.


To be effective in creating a formal channel for disseminating information about the key areas of finance that matter to the organization, it is important to understand the operational drivers. By operational drivers, I mean drivers that may uniquely affect each functional area. For example, sales may be concerned with revenue and margins, whereas production may be concerned with server capacity, work-in-process, throughput, etc. In the end, the financial metrics are derivatives: they are products of one or more drivers, and these drivers are the elements that need to be fleshed out to effect a spirited conversation. That would then enable the production of a financial barometer that everyone in the organization can rally behind and understand, and more importantly, use to assess how their individual contribution has advanced, and will advance, organizational goals.
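To make concrete the point that financial metrics are derivatives of operational drivers, here is a minimal sketch in Python. All driver names and figures below are hypothetical, chosen only to illustrate how a financial barometer falls out of the operational inputs:

```python
# Hypothetical operational drivers (illustrative names and figures only)
sales_drivers = {"units_sold": 1200, "avg_price": 50.0, "unit_cost": 30.0}
production_drivers = {"throughput_per_hour": 40, "hours": 160}

# Financial metrics are not measured directly; they fall out as
# products and ratios of the underlying operational drivers.
revenue = sales_drivers["units_sold"] * sales_drivers["avg_price"]
gross_margin = (sales_drivers["avg_price"] - sales_drivers["unit_cost"]) / sales_drivers["avg_price"]
capacity = production_drivers["throughput_per_hour"] * production_drivers["hours"]
utilization = sales_drivers["units_sold"] / capacity  # demand vs. available capacity

print(f"Revenue: {revenue:,.0f}")                  # -> Revenue: 60,000
print(f"Gross margin: {gross_margin:.0%}")         # -> Gross margin: 40%
print(f"Capacity utilization: {utilization:.1%}")  # -> Capacity utilization: 18.8%
```

The point of the sketch is that each function sees only its own drivers (price, throughput, hours), yet the finance conversation happens one level up, in the derived quantities.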

Aaron Swartz took down a piece of the Berlin Wall! We have to take it all down!

“The world’s entire scientific … heritage … is increasingly being digitized and locked up by a handful of private corporations… The Open Access Movement has fought valiantly to ensure that scientists do not sign their copyrights away but instead ensure their work is published on the Internet, under terms that allow anyone to access it.”  – Aaron Swartz

Information, in the context of scholarly articles by researchers at universities and think-tanks, is not a zero-sum game. In other words, one person cannot have more without someone else having less. When you start erecting “Berlin” walls in the information arena within the halls of learning, then learning itself is compromised. In fact, contributing or granting the intellectual estate to the creative commons serves a higher purpose in society: access to information and, hence, a feedback mechanism that ultimately enhances the value of the end product itself. How? Because the product is now distributed across a broader and more diverse audience, and it is open to further critical analysis.


The universities have built a racket. They have erected a Chinese wall between learning in a cloistered environment and the world of those who are not immediate participants. The Guardian wrote an interesting article on this matter, and a very apt quote puts it all together.

“Academics not only provide the raw material, but also do the graft of the editing. What’s more, they typically do so without extra pay or even recognition – thanks to blind peer review. The publishers then bill the universities, to the tune of 10% of their block grants, for the privilege of accessing the fruits of their researchers’ toil. The individual academic is denied any hope of reaching an audience beyond university walls, and can even be barred from looking over their own published paper if their university does not stump up for the particular subscription in question.


This extraordinary racket is, at root, about the bewitching power of high-brow brands. Journals that published great research in the past are assumed to publish it still, and – to an extent – this expectation fulfils itself. To climb the career ladder academics must get into big-name publications, where their work will get cited more and be deemed to have more value in the philistine research evaluations which determine the flow of public funds. Thus they keep submitting to these pricey but mightily glorified magazines, and the system rolls on.”

http://www.guardian.co.uk/commentisfree/2012/apr/11/academic-journals-access-wellcome-trust


JSTOR is a not-for-profit organization that has invested heavily in providing an online system for archiving, accessing, and searching digitized copies of over 1,000 academic journals. More recently, I noticed some effort on their part to allow public access to only 3 articles over a period of 21 days. This stinks! This policy reflects an intellectual snobbery of Himalayan proportions. The only folks that have access to these academic journals and studies are professors and researchers affiliated with a university and its libraries. Aaron Swartz noted the injustice of hoarding such knowledge and tried to distribute a significant proportion of JSTOR’s archive through one or more file-sharing sites. And what happened thereafter was perhaps one of the biggest misapplications of justice. The same justice that disallows asymmetry of information on Wall Street is being deployed to preserve the asymmetry of information in the halls of learning.


MSNBC contributor Chris Hayes criticized the prosecutors, saying “at the time of his death Aaron was being prosecuted by the federal government and threatened with up to 35 years in prison and $1 million in fines for the crime of—and I’m not exaggerating here—downloading too many free articles from the online database of scholarly work JSTOR.”

The Associated Press reported that Swartz’s case “highlights society’s uncertain, evolving view of how to treat people who break into computer systems and share data not to enrich themselves, but to make it available to others.”

Chris Soghoian, a technologist and policy analyst with the ACLU, said, “Existing laws don’t recognize the distinction between two types of computer crimes: malicious crimes committed for profit, such as the large-scale theft of bank data or corporate secrets; and cases where hackers break into systems to prove their skillfulness or spread information that they think should be available to the public.”

 

Kelly Caine, a professor at Clemson University who studies people’s attitudes toward technology and privacy, said Swartz “was doing this not to hurt anybody, not for personal gain, but because he believed that information should be free and open, and he felt it would help a lot of people.”

And then there were some modest reservations, and Swartz’s actions were attributed to reckless judgment. I contend that this does injustice to someone of Swartz’s commitment and intellect. The recklessness was his inability to grasp that an imbecile in the system would pursue 35 years of imprisonment and a $1M fine; it was not that he was unaware of what he was doing, but that he believed, as do many, that scholarly academic research should be available to all.

We have a Berlin Wall that needs to be taken down. Swartz started that work, but he was unable to see it through. It is important not to rest in this endeavor; everyone ought to actively petition their local congressman to push bills that will allow open access to these academic articles.

John Maynard Keynes warned of the folly of “shutting off the sun and the stars because they do not pay a dividend,” because what is at stake here is the reach of the light of learning. Aaron was at the vanguard of that movement, and we should persevere to become the points of light that will compel JSTOR to disseminate the information that it guards so jealously.


Introduce Culture into Product Development

All products go through a life-cycle. However, the genius of an organization lies in how it manages the life-cycle of the product and extends it as necessary to serve its customers. Thus, it is not merely wizardry in technology and manufacturing that determines the ultimate longevity of the product in the market and its mind share among customers. The product has to respond to a diversity of demands determined by disposable income, demographics, geography, etc. In business-school speak, we say that this is part of market segmentation, coupled with the appropriate marketing message. However, there is rarely an explicit strategy formulated around identifying

  1. Corporate Culture
  2. Extended Culture

To achieve success, firms increasingly must develop products by leveraging and coordinating broad creative capabilities and resources, which are often diffused across geographical and cultural boundaries. But we have to explore a lot more than that, from the incipient stages in which a product is imagined: How do we instill unique corporate DNA into the product so that it immediately bears a corporate signature? In addition, how do we build out a product that is tenable across the farthest reaches of geography and cultural diversity?




Thus, an innovative approach is called for in product development, particularly in a global context. The approach entails getting cross-disciplinary teams in the liberal arts, science, business, etc. to work together to gather deeper insights into the cultural strains that drive decisions in various markets. To reiterate, no one function is paramount: all of them have to work and improvise together while ensuring that there are channels to gather feedback. The cross-disciplinary team, and the institutionalization of a feedback mechanism that can be quickly acted upon, are the key parameters for ensuring that the right product is in the market and that it will be extended in response to the chatter of the crowds.


Having said that, this is hardly news! A lot of companies are well on their way to instilling these factors into product design and development. Companies have created organizational architectures so that culturally appropriate products are developed and maintained in dispersed local markets. However, in most instances, we have also seen that their approach is to have local managers run the show, with the presumption that these “culturally appropriate” products will make good in those markets. But along the way, the piece that unravels over time, on account of creating the local flavor, is that the product may no longer mirror the culture that the corporate group wants to instill. If these two are not aptly managed and balanced, islands of conflict will be created. Thus, my contention is that a top-down value mandate ought to set the appropriate parameters inside which the hotbed of collaborative activity for product design and development in various markets would take place.


Thus, the necessary top-down value systems that would bring culture into products are:

  1. Open areas for employees to express their thoughts and ideas.
  2. Diversity of people with different skill sets on product teams.
  3. Encouragement of internal and external speakers to expound upon the product touch points in the community.
  4. Empowerment and recognition systems.
  5. Proper formulation of monetary incentives to inspire and maintain focus.

 

Darkness at Noon in Facebook!

Facebook began with a simple thesis: connect friends. That was the sine qua non of its existence. From a simple thesis and an effective UI design, Facebook has grown over the years to become the third-largest community in the world. But over the last few years it has had to resort to generating revenue to meet shareholder expectations. Today it is noon at Facebook, but a long shadow of darkness has, I posit, fallen upon perhaps one of the most influential companies in history.


The fact is that leaping from connecting friends to managing the conversations allows Facebook to create a petri dish for understanding social interactions at large scale, eased by its fine technology platform. To that end, it is moving into alternative distribution channels to create broader reach into a global audience and to gather deeper insights into the interaction templates of the participants. The possibilities are immense: this platform can be a collaborative beachhead into discovery, exploration, learning, education, and social and environmental awareness, and can ultimately contribute to an elevated human conscience. But it has faltered (perhaps the shareholders and the analysts are much to blame) under the entangled demands of the market, and it has become one global billboard for advertisers to promote their brands. Darkness at noon is the most appropriate metaphor for Facebook as it is now.


Let us take a small turn to briefly look at some other very influential companies that have not been derailed as much as Facebook: Twitter, Google and LinkedIn. Each of them is the leader in its category, and all of them have moved toward monetization schemes built on their specific user bases. Each has weighed in significantly in its respective category to create movements that have affected, or will affect, the course of the future. We all know how Twitter has contributed super-fast global news feeds that have spontaneously generated mass coalescence around issues that make a difference; Google has been an effective tool for allowing an average person to access information; and LinkedIn has created a collaborative environment in the professional space. Thus, all three of these companies, despite fully sating their appetite for revenue through advertising, have not compromised their quintessential reason for being. Now, all of these companies could certainly move their artillery to encompass the trajectory of Facebook, but that would be a steep hill to climb. Furthermore, these companies have an aura associated with their categories: attempts to move out of their categories have been feeble at best and, in some instances, unsuccessful. Facebook has a phenomenal chance of putting together what it has to create a communion of knowledge and wisdom. And no company in the market is better suited to do that at this point.


One could counter that Facebook sticks to its original vision and that what we have today is indeed what Facebook had planned for all along. I don’t disagree. My point of contention is that though Facebook has created this informal and awesome platform for conversations and communities among friends, it has glossed over the immense positive fallout that could occur as a result of these interactions: the development and enhancement of knowledge, collaboration, cultural play, diversity of thought, philanthropy, the crowdsourcing of scientific and artistic breakthroughs, etc. In other words, the objective has been met for the most part. Thank you, Mark! Now Facebook needs to usher in a renaissance in the courtyard. Facebook needs to find a way out of the advertising morass that has cast darkness over all the product extensions and launches of the last two years: Facebook can force a point of inflection and quadruple its impact on the course of history and knowledge. And the revenue will follow!

Why Jugglestars? How will this benefit you?

Consider this. Your professional career is a series of projects. Employers look for accountability and performance, and they measure you by how you fare on your projects. Everything else, for the most part, is white noise. The projects you work on establish your skill set and, before long, your career trajectory. However, all the great stuff that you have done at work is, for the most part, hidden from other people in your company and from your professional colleagues. You may get a recommendation on LinkedIn, which is fairly high-level, or you may receive endorsements for your skills, which is awesome. But the endorsements on LinkedIn seem a little random, don’t they? Wouldn’t it be just awesome to recognize, or be recognized by, your colleagues for projects that you have worked on? We are sure that there are projects you have worked on that involve third-party vendors, consultants, service providers, clients, etc. Well, now you have a forum to send and receive recognition, in a beautiful form factor, that you can choose to display across your networks.


Imagine an employee review. You must have spent some time thinking through all the great stuff you have done that you want to attach to your review form. And you may have, in your haste, forgotten some of the great stuff you have done and been recognized for informally. So how cool would it be to print or email to your manager all the projects that you’ve worked on and the recognition you’ve received? How cool would it be to show all the people that you have recognized for their phenomenal work? In the act of participating in the recognition ecosystem that our application provides, you show yourself to be an engaged and prized employee that any company would want to retain, nurture and develop.


 

Now imagine you are looking for a job. You have a resume. That is nice. And then the potential employer or recruiter is redirected to your professional networks and they have a glimpse of your recommendations and skill sets. That is nice too! But seriously…wouldn’t it be better for the hiring manager or recruiter to have a deeper insight into some of the projects that you have done and the recognition that you have received? Wouldn’t it be nice for them to see how active you are in recognizing great work of your other colleagues and project co-workers?  Now they would have a more comprehensive idea of who you are and what makes you tick.


We help you build your professional brand and convey your accomplishments. That translates into greater internal development opportunities in your company, promotions and pay increases, and it also makes you more marketable. We help you connect with high-achievers and manage, for the long term, a digital portfolio of achievements that can, at your request, exist in an open environment. JuggleStars.com is a great career-management tool.

Check out www.jugglestars.com


The Unbearable Lightness of Being

Where the mind is without fear and the head is held high
Where knowledge is free
Where the world has not been broken up into fragments
By narrow domestic walls
Where words come out from the depth of truth
Where tireless striving stretches its arms towards perfection
Where the clear stream of reason has not lost its way
Into the dreary desert sand of dead habit
Where the mind is led forward by thee
Into ever-widening thought and action
Into that heaven of freedom, my Father, let my country awake.

– Rabindranath Tagore

Among the many debates in philosophy, one of the most fundamental has been around the concept of free will. The debate stems from two arguments.

1) Since future actions are governed by the circumstances of the present and the past, human beings’ future actions are predetermined by what has been learned from the past. Hence, the actions that occur are not truly a consequence of free will.

2) The counter-argument is that future actions may not necessarily be determined and governed by the legacy of the present and the past, which leaves headroom for the individual to exercise free will.

Now, one may wonder what determinism, or the lack of it, has to do with the current state of things in an organizational context. How is this relevant? Why are the abstract notions of determinism and free will important enough to be considered in the context of organizational evolution? How does their meaning lend itself to structured institutions like business organizations, whose sole purpose is to create products and services to meet market demand?

So we will throw a wrinkle into this line of thought: the element of chance. How does chance change the entire dialectic? Simply because chance is an unforeseen and random event that cannot be pre-determined; in fact, a chance event may not have a causal trigger. And chance, or luck, could be meaningful enough to untether an organization and its people to explore alternative paths. It is how the organization and its people are aligned to take advantage of that random, nondeterministic future that could make a huge difference to the long-term fate of the organization.

The principle of inductive logic states that what is true for n and n+1 should be true for n+2. Induction creates predictability, and hence organizations create pathways to exploit its logical extension. It is the most logical apparatus that exists for advancing groups in a stable but robust manner to address the multitude of challenges that they have to grapple with. After all, the market is governed by animal spirits! But let us think through this very carefully. All competition or collaboration that occurs among groups to address market demands results in homogeneous behavior with generally homogeneous outcomes. Simply put, products and services become commoditized. Their variance is neither unique nor distinctive. However, they could be just distinctive enough to eke out profits at the margins before being absorbed into a bigger whole. At that point, identity is effaced over time. Organizations gravitate to a singularity. Unique value propositions wane over time.
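For readers who want the formal version, the induction principle alluded to above (stated here in standard notation, as an aside rather than as part of the original argument) is:

```latex
% Principle of mathematical induction: if a property P holds for a base
% case and is preserved from each n to n+1, it holds for every n.
\[
\bigl( P(0) \;\land\; \forall n \,\bigl( P(n) \Rightarrow P(n+1) \bigr) \bigr)
\;\Longrightarrow\; \forall n \, P(n)
\]
```

It is precisely this guarantee of "more of the same" that makes inductive reasoning so attractive to organizations, and so blind to chance.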

So let us circle back to chance. Chance is our hope for creating divergence. Chance is the factoid that cancels out the inductive vector of industrial organization. Chance does not sit waiting around the corner; it is not a “Waiting for Godot” metaphor. If it did, it would have been imputed by the determinists in their inductive world, and we would end up with a dystopian, homogeneous future. Chance happens. And sometimes it has a very short half-life. If the organization and its people are aligned, and their mindset is adapted toward embracing and exploiting that fleeting factoid of chance, the consequences could be huge. New models would emerge, new divergent paths would be traced, and society and markets would burst into a garden of colorful ideas in a virtual oasis of new markets.

So now to tie this all to free will and to the unbearable lightness of being! It is the existence of chance that creates the opportunity for an individual to exercise free will, but it is the organization’s responsibility to allow the individual to unharness themselves from organizational inertia. Thus, organizations have to perpetuate an environment wherein employees are afforded some headroom to break away. And I don’t mean break away as in people leaving the organization to do their own gigs; I mean breaking away in thought and action, within the boundaries of the organization, to be open to the element of chance and exploit it. Great organizations do not just encourage the lightness of being by unharnessing talent; the great organizations are the ones that make the lightness of being unbearable. Their people are left with nothing but an awareness of and openness to chance, with which to create incredible value: far more incredible, awe-inspiring and momentous than the more serene state of business as usual.