Category Archives: Innovation
The whole is greater than the sum of its parts. “Emergent properties” are properties that emerge and that might be entirely unexpected. As discussed in CAS, they arise from the collaborative functioning of a system. In other words, emergent properties are properties of a group of items, and it would be erroneous to reduce such systems to the properties of their atomic elements and then use those properties as the building blocks for understanding emergence. Some common examples of emergent properties include cities, beehives, ant colonies and market systems. Our thinking attributes causal effects – namely, that the behavior of elements causes certain behaviors at other levels of the hierarchy, and thus an entity emerges in a certain state. However, we observe that emergence is, in effect, the observation of an effect without an apparent cause. Yet it is important to step back, regard the relationships, and draw lines of attribution, such that one can conclude that elements at the lowest level have an impact that surfaces, in some manner, at the highest level – the level that is the subject of our observation.
Jochen Fromm, in his paper “Types and Forms of Emergence,” has laid this out best. He says that emergent properties are “amazing and paradox: fundamental but familiar.” In other words, emergent properties are changeless and changing, constant and fluctuating, persistent and shifting, inevitable and unpredictable. The most important note he makes is that the emergent property is part of the system and, at the same time, might not always be a part of the system. There is an undercurrent of novelty, or punctuated gaps that might arise, that is inexplicable, and it is this fact that renders true emergence virtually irreducible. Thus, failure is embodied in all emergent systems – failure being that the system does not behave according to expectation. Even when all rules are followed and quality thresholds are established at every toll gate at the highest level, there is still a possibility of failure, which suggests that there is some missing information in the links. It is also possible that the missing information is dynamic – you do not step in the same water twice – which makes predicting emergent systems a rather difficult exercise. Depending on the lens through which we look, the system might appear or disappear.
There are two types of emergence: descriptive and explanatory emergence. Descriptive emergence means that the properties of wholes cannot necessarily be defined through the properties of the parts. Explanatory emergence means that the laws of complex systems cannot be deduced from the laws of interaction of the simpler elements that constitute them. Thus, emergence is a result of the amount of variety embodied in the system, the amount of external influence that weighs on and shapes the overall property and direction of the system, the type of resources the system consumes, the type of constraints the system operates under, and the number of levels of sub-systems that work together to build out the final system. A system can be benign, in that it is relatively more predictable, whereas a radical system is a material departure of a system from expectation. If the parts that constitute a system are independent of the workings of other parts and can be boxed within boundaries, emergent systems become more predictable. A watch is a good example: its different mechanical elements are geared toward its ultimate purpose of reading the time. It is a good example of a complex physical system. However, such systems are very brittle – a failure at one point can cascade into a failure of the entire system. Systems that are more resilient are those whose elements interact and learn from one another. In other words, the behavior of the elements excites other elements – all of which work together in a dance toward a more stable state. They deploy what are often called the flocking trick and the pheromone trick. The flocking trick is largely the emulation of particles that are close to each other – very similar to the cellular automata introduced by von Neumann and discussed in the earlier chapter.
The pheromone trick reflects how elements leave marks that other elements act upon as signals; they all work together around these signal trails, which act as a forcing function to create the system.
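The flocking trick can be sketched in a few lines of code. The model below is a deliberately minimal, illustrative toy (the function names, neighborhood size, and adjustment rate are my assumptions, not drawn from any specific flocking model): each agent repeatedly nudges its heading toward the average heading of its nearby neighbors, with no leader and no global plan, and yet the group converges toward a shared direction.

```python
import random

def step(headings, neighborhood=2, rate=0.5):
    """One update: every agent moves its heading toward its local average."""
    n = len(headings)
    new = []
    for i, h in enumerate(headings):
        # neighbors within `neighborhood` positions of agent i (including itself)
        nbrs = headings[max(0, i - neighborhood):min(n, i + neighborhood + 1)]
        avg = sum(nbrs) / len(nbrs)
        new.append(h + rate * (avg - h))  # nudge toward the local average
    return new

random.seed(1)
headings = [random.uniform(0, 360) for _ in range(20)]  # random initial headings
initial_spread = max(headings) - min(headings)

for _ in range(50):
    headings = step(headings)

# Purely local imitation shrinks the spread of headings: a global
# "flock direction" emerges without any agent ever seeing the whole group.
spread = max(headings) - min(headings)
```

The point of the sketch is the mechanism, not the numbers: no rule mentions the flock, yet the flock appears.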
There are systems that have properties of extremely strong emergence. What do Consciousness, Life, and Culture have in common? How do we look at Climate? What about the organic development of cities? These are just some examples of systems where determinism is nigh impossible. We might be able to tunnel through the various and diverse elements that embody the system, but it would be difficult to coherently and tangibly draw the full set of relationships, signals, effectors, detectors, etc. to grapple with a complete understanding of the system. Wrestling with a strongly emergent system might be a task outside the purview even of the highest level of computational power available. And yet these systems exist, and they emerge and evolve. Still we try to plan for these systems, or direct policies to influence them, without fully knowing the impact. This is also where the unintended consequences of our actions might take free rein.
There are two models in complexity: Complex Physical Systems and Complex Adaptive Systems! For us to grasp the patterns that are evolving – much of it seemingly out of our control – it is important to understand both models. One could argue that these models are mutually exclusive. While the existing body of literature might be inclined to support that argument, we also find some degree of overlap that makes our understanding of complexity unstable. And instability is not to be construed as a bad thing! At times we might operate in a deterministic framework, and often we might operate in the realm of a gradient understanding of the volatility associated with outcomes. Keeping this in mind will be helpful as we dive deep into the two models. What we hope is that our understanding of these models will raise questions and establish mental frameworks for the intentional choices that we are led to make by the system, or that we make to influence the evolution of the system.
Complex Physical Systems (CPS)
Complex Physical Systems are bounded by certain laws. Given initial conditions or elements in the system, there is a degree of predictability and determinism associated with the behavior of the elements under the overarching laws of the system. Despite the nature of the term (Complex Physical System), which suggests a physical boundary, the late 1900s surfaced some nuances to this model: if there is a slight and arbitrary variation in the initial conditions, the outcome can be significantly different from expectations. The assumption of determinism is put to the sword. The notion that behaviors will follow established trajectories, if rules are established and laws are defined, has been put to the test. These discoveries offer an insight into the building blocks of complex physical systems; a better understanding of them will enable us to recognize such systems when we see them, and thereafter allow us to establish certain toll gates and actions to navigate and, to the extent possible, narrow the region of uncertainty around outcomes.
The universe is designed as a complex physical system. Just imagine! Let this sink in a bit. A complex physical system might be regarded as relatively simpler than a complex adaptive system. And with that in mind, once again … the universe is a complex physical system. We are awed by the vastness and scale of the universe; we regard the skies with an illustrious reverence; and we wonder and ruminate on what lies beyond the frontiers of the universe, if anything. Really, there is nothing bigger than the universe in the physical realm, and yet we regard it as a simple system – a “simple” complex physical system. In fact, the behavior of ants that leads to the sustainability of an ant colony is significantly more complex: by orders of magnitude.
Complexity behavior in nature reflects the tendency of large systems with many components to evolve into a poised “critical” state, where minor disturbances or arbitrary changes in initial conditions can have a seemingly catastrophic impact on the overall system, such that the system changes significantly. And that happens not by some invisible hand or some uber design. Fundamental to understanding complex systems is the recognition that complexity is defined as the variability of the system. Depending on our lens, the scale of variability can change, and that might call for a different apparatus to understand the system. Thus, determinism is not the measure: Stephen Jay Gould has argued that it is virtually impossible to predict the future. We have hindsight explanatory powers but not predictive powers. Hence, systems that start from an initial state might over time represent an outcome that is distinguishable in form and content from the original state. We see complex physical systems all around us: snowflakes, patterns on coastlines, waves crashing on a beach, rain, etc.
Complex Adaptive Systems (CAS)
Complex adaptive systems, by contrast, are learning systems that evolve. They are composed of elements, called agents, that interact with one another and adapt in response to those interactions.
Markets are a good example of complex adaptive systems at work.
CAS agents have three levels of activity. As described by Johnson in Complexity Theory: A Short Introduction – the three levels of activity are:
- Performance (moment-by-moment capabilities): This establishes the locus of all behavioral elements that signify the agent at a given point in time and thereafter establishes triggers or responses. For example, if an object is approaching and the response of the agent is to run, that would constitute a performance if-then outcome. Alternatively, it could be signal-driven – namely, an ant emits a certain scent when it finds food; other ants catch on to that trail and act, en masse, to follow it. Thus, an agent or actor in an adaptive system has detectors, which allow it to capture signals from the environment for internal processing, and effectors, which translate that processing into higher-order signals that influence other agents to behave in certain ways in the environment. The signal is the scent that creates these interactions and thus the rubric of a complex adaptive system.
- Credit assignment (rating the usefulness of available capabilities): As the agent gathers experience over time, it will start to rely heavily on certain rules or heuristics that it has found useful. It is also typical that these rules may not be the best rules; they could be rules that are a result of first discovery, and thus these rules stay. Agents rank these rules in some sequential order, perhaps an ordinal ranking, to determine the best rule to fall back on in a given situation. This is the crux of decision making. However, there are also times when it is difficult to assign a rank to a rule, especially if an action is setting or laying the groundwork for a future course of other actions. A spider weaving a web might be regarded as an example of an agent expending energy in the hope that she will get some food. This is a stage-setting assignment that agents have to undergo as well. One of the common models used to describe this is called the bucket-brigade algorithm, which essentially states that the strength of a rule depends on the success of the overall system and the agents that constitute it. In other words, each predecessor and successor needs to be aware only of the strengths of the previous and following agents, and that is done by some sort of number assignment that becomes stronger from the origin of the system to its end. If there is a final valuable end-product, then the pathway of rules reflects success. Once again, it is conceivable that this might not be the optimal pathway but a satisficing pathway that results in a better system.
- Rule discovery (generating new capabilities): Performance and credit assignment in agent behavior suggest that agents are governed by a certain bias. If agents have been successful following certain rules, they will be inclined to follow those rules all the time. As noted, rules might not be optimal but satisficing. Is improvement a matter of just incremental changes to the process? We do see major leaps in improvement … so how and why does this happen? In other words, someone in the process has decided to take a different rule despite their experience. It could have been an accident, or very intentional.
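The bucket-brigade idea described under credit assignment can be sketched in code. The sketch below is a radically simplified toy, not Holland's full classifier-system algorithm: rules form a fixed chain from first trigger to final payoff, each rule pays a fraction of its strength (its “bid”) back to the rule that set the stage for it, and the final rule collects the external reward. The chain length, bid fraction, and reward value are all illustrative assumptions.

```python
def run_chain(strengths, reward=10.0, bid_fraction=0.1):
    """One episode: bids flow backward along the chain, reward lands at the end."""
    new = strengths[:]
    for i in range(1, len(new)):
        bid = bid_fraction * new[i]
        new[i] -= bid        # rule i pays its bid...
        new[i - 1] += bid    # ...to the rule that set the stage for it
    new[-1] += reward        # the final rule collects the external payoff
    return new

# Four rules in a chain, all starting with equal strength.
strengths = [1.0, 1.0, 1.0, 1.0]
for episode in range(100):
    strengths = run_chain(strengths)
# After many episodes, credit has propagated backward: even the first,
# stage-setting rule has grown well beyond its initial strength, despite
# never touching the reward directly.
```

The design point is exactly the one in the text: no rule needs global knowledge; each one interacts only with its immediate predecessor and successor, and yet success at the end of the chain eventually strengthens the rules at its beginning.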
One of the theories that has been presented is that of building blocks. CAS innovation is a result of reconfiguring the various components in new ways. One quips that if energy is neither created nor destroyed … then everything that exists today or will exist tomorrow is nothing but a reconfiguration of energy in new ways. All of tomorrow resides in today … just patiently waiting to be discovered. Agents create hypotheses and experiment in the petri dish, reconfiguring their experiences and other agents’ experiences to formulate hypotheses and the runway for discovery. In other words, there is a collaboration element that comes into play, where the interaction of the various agents, and their assignment as a group to a rule, also sets the stepping stone for potential leaps in innovation.
Another key characteristic of CAS is that the elements are constituted in a hierarchical order. Combinations of agents at a lower level result in a set of agents higher up, and so on and so forth. Thus, agents in higher hierarchical orders take on some of the properties of the lower orders, but they also include the interaction rules that distinguish the higher order from the lower order.
Complexity theory began in the 1930s, when natural scientists and mathematicians rallied together to gain a deeper understanding of how systems emerge and play out over time. However, the groundwork of complexity theory was laid in the 1850s with Darwin’s introduction of Natural Selection, and it was further extended by Mendel’s genetics. Darwin’s Theory of Evolution has been posited as a slow, gradual process. He says that “Natural selection acts only by taking advantage of slight successive variations; she can never take a great and sudden leap, but must advance by short and sure, though slow steps.” Thus, in his view, complex systems evolve through small steps, and the result is an organic formulation of an irreducibly complex system – one composed of many parts, all of which work together closely for the overall system to function. If any part is missing or does not act as expected, then the system becomes unwieldy and breaks down. This was an early foray into distinguishing the emergent property of a system from the elements that constitute it. Mendel, on the other hand, laid out the property of inheritance across generations. An organic system inherits certain traits that are reconfigured over time and adapt to the environment, thus leading to the development of an organism which, for our purposes, falls in the realm of a complex outcome. One would imagine that there is a common thread between Darwin’s Natural Selection and Mendel’s laws of genetic inheritance. But that is not the case, and that has wide implications for complexity theory. Mendel focused on how traits are carried across time: mechanics that are largely determined by probabilistic functions. Mendel’s underlying theory hinted at the possibility that a complex system is a result of discrete traits that are passed on, while Darwin suggested that complexity arises from continuous random variations.
In the 1920s, the literature suggested that a complex system has elements of both: continuous adaptation and discrete inheritance that is hierarchical in nature. A group of biologists reconciled the theories into what is commonly known as the Modern Synthesis. The principles guiding the Modern Synthesis were that Natural Selection is the major mechanism of evolutionary change, and that small random variations of genes, combined with natural selection, result in the origin of new species. Furthermore, the new species might have properties different from those of the elements that constitute them. The Modern Synthesis thus provided a framework for complexity theory. What does this great debate mean for our purposes? Once we determine that a system is complex, how does the debate shed more light on our understanding of complexity? Does it shed light on how we regard complexity and how we subsequently deal with it? We need to further extend our thinking by looking at a few developments that occurred in the 20th century that give us a better perspective. Let us then continue our journey into the evolution of the thinking around complexity.
Axioms are statements that are self-evident. An axiom serves as a premise or starting point for further reasoning and arguments. An axiom is thus not contestable, because if it were, then all the reasoning built upon it would fall apart. Thus, for our purposes and our understanding of complexity theory: a complex system has an initial state that is irreducible, physically or mathematically.
One of the key elements in complexity is computation, or computability. In the 1930s, Turing introduced the abstract concept of the Turing machine. There is a lot of literature that goes into the specifics of how the machine works, but that is beyond the scope of this book. However, there are key elements that can be gleaned from the concept to better understand complex systems. A complex system that evolves is a result of a finite number of steps that solve a specific challenge. Although the concept has been applied within the boundaries of computational science, I am taking the liberty of applying it to emerging complex systems. Complexity classes help scientists categorize problems based on how much time and space is required to solve them and verify their solutions. Complexity is thus a function of time and memory. This is a very important concept, and we have radically simplified it to attend to a self-serving purpose: to understand complexity and how to solve the grand challenges. Time complexity refers to the number of steps required to solve a problem. A complex system might not necessarily be the most efficient outcome, but it is nonetheless the outcome of a series of steps, backward and forward, that result in a final state. There are pathways, or efficient algorithms, that are produced, and the mechanical steps to produce them are defined and known. Space complexity refers to how much memory the algorithm requires to solve the problem. Let us keep these concepts in mind as we round this all up into a more comprehensive picture at the end of this chapter.
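The distinction between time and space complexity is easiest to see in code. The toy below (my own illustrative example, not from any particular text) computes Fibonacci numbers two ways: both take a number of steps linear in n (the same time complexity), but the first remembers the entire history of intermediate values (space grows with n), while the second keeps only the last two (constant space).

```python
def fib_table(n):
    """Linear time, linear space: stores every intermediate value."""
    table = [0, 1]
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])  # whole history kept: O(n) memory
    return table[n] if n > 0 else table[0]

def fib_pair(n):
    """Linear time, constant space: remembers only the last two values."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b  # only two numbers ever held: O(1) memory
    return a

# Same answer, same number of steps, very different memory footprints.
```

The pair of functions makes the chapter's point concrete: two processes can reach the same final state in the same number of steps while consuming very different resources along the way.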
Around the 1940s, John von Neumann introduced the concept of self-replicating machines. Like Turing, von Neumann designed an abstract machine which, when run, would replicate itself. The machine consists of three parts: a ‘blueprint’ for itself; a mechanism that can read any blueprint and construct the machine (sans blueprint) specified by that blueprint; and a ‘copy machine’ that can make copies of any blueprint. After the mechanism has been used to construct the machine specified by the blueprint, the copy machine is used to create a copy of that blueprint, and this copy is placed into the new machine, resulting in a working replication of the original machine. Some machines will do this backwards, copying the blueprint first and then building the machine. The implications are significant. Can complex systems regenerate? Can they copy themselves and exhibit the same behavior and attributes? Are emergent properties equivalent? Does history repeat itself, or does it rhyme? How does this thinking move our understanding and operating template forward once we identify complex systems?
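Software has a well-known analogue of von Neumann's construction: the quine, a program whose output is exactly its own source code. In the classic two-line Python quine below, the string `s` plays the role of the blueprint, and the print statement plays the role of the constructor plus copy machine, rebuilding both the machine and the blueprint inside it. (The analogy is mine, offered only as an illustration of the blueprint-plus-copier idea.)

```python
# A quine: running this program prints this program.
# `s` is the "blueprint"; `print(s % s)` reads the blueprint (%r inserts a
# quoted copy of s itself) and emits the complete source, blueprint included.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Note the trick that makes self-reference work: the blueprint cannot literally contain itself, so it contains a placeholder (`%r`) that is filled with a copy of the blueprint at construction time, exactly as von Neumann's copier inserts a fresh blueprint into each new machine.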
Let us step forward into the late 1960s, when John Conway started doing experiments extending the concept of cellular automata. As a result of those experiments, he introduced the Game of Life in 1970. His main thesis was simple: the game is a zero-player game, meaning that its evolution is determined by its initial state, requiring no further input. One interacts with the Game of Life by creating an initial configuration and observing how it evolves or, for advanced players, by creating patterns with particular properties. The entire formulation was done on a two-dimensional universe in which patterns evolve over time. It is one of the finest examples in science of how a few simple non-arbitrary rules can result in incredibly complex behavior that is fluid and provides a pleasing pattern over time. In other words, an outsider looking in would see a pattern emerging from simple initial states and simple rules. We encourage you to look at the many patterns that people have constructed using different Game of Life parameters. The main elements are as follows. A square grid contains cells that are alive or dead. The behavior of each cell depends on the state of its eight immediate neighbors, a neighborhood Conway chose to keep the model simple. The cells strictly follow these rules:
- A live cell with zero or one live neighbors dies.
- A live cell with two or three live neighbors remains alive.
- A live cell with four or more live neighbors dies.
- A dead cell with exactly three live neighbors becomes alive.
- In all other cases, a dead cell stays dead.
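The five rules above fit in a short program. This is a minimal sketch (the grid is represented simply as a set of live-cell coordinates, and the helper names are my own), demonstrated on the “blinker,” a three-cell row that oscillates between horizontal and vertical with period two.

```python
def neighbors(cell):
    """The eight immediate neighbors of a cell on the square grid."""
    x, y = cell
    return [(x + dx, y + dy)
            for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)]

def step(live):
    """Apply Conway's rules once. `live` is the set of live-cell coordinates."""
    candidates = set(live) | {n for c in live for n in neighbors(c)}
    new = set()
    for c in candidates:
        count = sum(1 for n in neighbors(c) if n in live)
        if c in live and count in (2, 3):   # survives with two or three neighbors
            new.add(c)
        elif c not in live and count == 3:  # born with exactly three neighbors
            new.add(c)
        # every other case: the cell is dead in the next generation
    return new

# The "blinker": a horizontal row of three live cells...
blinker = {(0, 1), (1, 1), (2, 1)}
after_one = step(blinker)   # ...becomes a vertical row of three...
after_two = step(after_one) # ...and returns to the original horizontal row.
```

Even this tiny pattern shows the zero-player character of the game: after the initial configuration is chosen, the rules alone drive everything that follows.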
Thus, what his simulation led to is the determination that life is an example of emergence and self-organization: complex patterns can emerge from the implementation of very simple rules. The Game of Life thus encourages the notion that “design” and “organization” can spontaneously emerge in the absence of a designer.
Stephen Wolfram introduced the concept of Class 4 cellular automata, of which Rule 110 is well known and widely studied. Class 4 automata validate much of the thinking grounding complexity theory. Wolfram showed that certain patterns emerge from initial conditions that are neither completely random nor regular: they hint at an order, and yet the order is not predictable. One might expect that applying a simple rule repetitively to the simplest possible starting point would yield a system that is orderly and predictable, but that is far from the truth. The results exhibit some randomness and yet produce patterns with order and some intelligence.
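Rule 110 itself is only a lookup table with eight entries, which makes it easy to sketch. In the code below, each cell's next state is determined by itself and its two immediate neighbors; the number 110 is simply the binary encoding (01101110) of the outputs for the eight possible three-cell neighborhoods. The ring (wrap-around) boundary and the grid size are my illustrative choices, not part of the rule.

```python
RULE = 110  # binary 01101110: the output bit for each 3-cell neighborhood

def rule110_step(row):
    """One generation: each cell looks at (left, self, right) as a 3-bit index."""
    n = len(row)
    return [
        (RULE >> (row[(i - 1) % n] * 4 + row[i] * 2 + row[(i + 1) % n])) & 1
        for i in range(n)
    ]

# The simplest possible seed: one live cell on an otherwise dead ring.
row = [0] * 31
row[15] = 1
history = [row]
for _ in range(15):
    row = rule110_step(row)
    history.append(row)
# Printing `history` as rows of blanks and blocks reveals the characteristic
# Rule 110 pattern: neither fully regular nor fully random.
```

From a single bit and a single eight-entry table, the history grows structures that are neither periodic nor noise, which is exactly the Class 4 behavior the text describes.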
Thus, his main conclusion from this discovery is that complexity does not have to beget complexity: simple forms following repetitive and deterministic rules can result in systems that exhibit complexity that is unexpected and unpredictable. However, he sidesteps the discussion around the level of complexity that his Class 4 automata generate. Does this determine, or shed light on, evolution, how human beings are formed, how cities evolve organically, how climate is impacted, and how the universe undergoes change? One could argue that it does not. However, if you take into account Darwin’s natural selection process, Mendel’s laws of genetic inheritance and their propagation, the definitive steps prescribed by the Turing machine that capture time and memory, von Neumann’s theory of machines able to replicate themselves without any guidance, and Conway’s tour de force in proving that initial conditions without any further input can create intelligent systems – you can start connecting the dots to arrive at a core conclusion: higher-order systems can organically create themselves from initial starting conditions, naturally. They exhibit a collective intelligence that is outside the boundaries of precise prediction. In the previous chapter we discussed complexity and introduced an element of subjective assessment into how we regard what is complex and the degree of complexity. Whether complexity falls in the realm of a first-person subjective phenomenon or a scientific, third-party objective phenomenon has yet to be ascertained. Yet it is indisputable that the product of a complex system might be considered a live pattern of rules acting upon agents to cause some deterministic but random variation.
“The world’s entire scientific … heritage … is increasingly being digitized and locked up by a handful of private corporations… The Open Access Movement has fought valiantly to ensure that scientists do not sign their copyrights away but instead ensure their work is published on the Internet, under terms that allow anyone to access it.” – Aaron Swartz
Information, in the context of scholarly articles by researchers at universities and think-tanks, is not a zero-sum game. In other words, one person cannot have more without someone else having less. When you start erecting “Berlin” walls in the information arena within the halls of learning, then learning itself is compromised. In fact, contributing or granting the intellectual estate to the creative commons serves a higher purpose in society – access to information and, hence, a feedback mechanism that ultimately enhances the value of the end-product itself. How? Because the product has been distributed across a broader and more diverse audience, it is open to further critical analysis.
The universities have built a racket. They have deployed a Chinese wall between learning in a cloistered environment and the wider world of non-participants. The Guardian wrote an interesting article on this matter, and a very apt quote puts it all together:
“Academics not only provide the raw material, but also do the graft of the editing. What’s more, they typically do so without extra pay or even recognition – thanks to blind peer review. The publishers then bill the universities, to the tune of 10% of their block grants, for the privilege of accessing the fruits of their researchers’ toil. The individual academic is denied any hope of reaching an audience beyond university walls, and can even be barred from looking over their own published paper if their university does not stump up for the particular subscription in question.
This extraordinary racket is, at root, about the bewitching power of high-brow brands. Journals that published great research in the past are assumed to publish it still, and – to an extent – this expectation fulfils itself. To climb the career ladder academics must get into big-name publications, where their work will get cited more and be deemed to have more value in the philistine research evaluations which determine the flow of public funds. Thus they keep submitting to these pricey but mightily glorified magazines, and the system rolls on.”
JSTOR is a not-for-profit organization that has invested heavily in providing an online system for archiving, accessing, and searching digitized copies of over 1,000 academic journals. More recently, I noticed some effort on their part to allow public access to only 3 articles over a period of 21 days. This stinks! This policy reflects an intellectual snobbery of beyond-Himalayan proportions. The only folks that have access to these academic journals and studies are professors and researchers affiliated with a university, and university libraries. Aaron Swartz noted the injustice of hoarding such knowledge and tried to distribute a significant proportion of JSTOR’s archive through one or more file-sharing sites. And what happened thereafter was perhaps one of the biggest misapplications of justice. The same justice that disallows asymmetry of information on Wall Street is being deployed to preserve the asymmetry of information in the halls of learning.
MSNBC contributor Chris Hayes criticized the prosecutors, saying “at the time of his death Aaron was being prosecuted by the federal government and threatened with up to 35 years in prison and $1 million in fines for the crime of—and I’m not exaggerating here—downloading too many free articles from the online database of scholarly work JSTOR.”
The Associated Press reported that Swartz’s case “highlights society’s uncertain, evolving view of how to treat people who break into computer systems and share data not to enrich themselves, but to make it available to others.”
Chris Soghoian, a technologist and policy analyst with the ACLU, said, “Existing laws don’t recognize the distinction between two types of computer crimes: malicious crimes committed for profit, such as the large-scale theft of bank data or corporate secrets; and cases where hackers break into systems to prove their skillfulness or spread information that they think should be available to the public.”
Kelly Caine, a professor at Clemson University who studies people’s attitudes toward technology and privacy, said Swartz “was doing this not to hurt anybody, not for personal gain, but because he believed that information should be free and open, and he felt it would help a lot of people.”
And then there were some modest reservations, with Swartz’s actions attributed to reckless judgment. I contend that this does injustice to someone of Swartz’s commitment and intellect … the recklessness was his inability to grasp the notion that an imbecile in the system would pursue 35 years of imprisonment and a $1M fine … it was not that he was unaware of what he was doing, but that he believed, as do many, that scholarly academic research should be available to all, for free.
We have a Berlin wall that needs to be taken down. Swartz started that work, but he was unable to keep at it. It is important not to rest in this endeavor; everyone ought to actively petition their local congressman to push bills that will allow open access to these academic articles.
John Maynard Keynes warned of the folly of “shutting off the sun and the stars because they do not pay a dividend,” because what is at stake here is the reach of the light of learning. Aaron was at the vanguard of that movement, and we should persevere to become the points of light that will enable JSTOR to disseminate the information that they guard so zealously.
All products go through a life-cycle. However, the genius of an organization lies in how it manages the life-cycle of the product and extends it as necessary to serve its customers. Thus, it is not merely wizardry in technology and manufacturing that determines the ultimate longevity of the product in the market and its share of the customer’s mind. The product has to respond to the diversity of demands determined by disposable income, demographics, geography, etc. In business-school speak, we say that this is part of market segmentation coupled with the appropriate marketing message. However, there is rarely an explicit strategy formulated around identifying
- Corporate Culture
- Extended Culture
To achieve success, firms increasingly must develop products by leveraging and coordinating broad creative capabilities and resources, which often are diffused across geographical and cultural boundaries. But we have to explore a lot more than that, starting from the incipient stages in which a product is imagined: How do we instill unique corporate DNA into the product so that it immediately bears a corporate signature? And how do we build out a product that is tenable across the farthest reaches of geography and cultural diversity?
Thus, an innovative approach is called for in product development … particularly in a global context. The approach entails getting cross-disciplinary teams in the liberal arts, science, business, etc. to work together to gather deeper insights into the cultural strains that drive decisions in various markets. To reiterate, no one particular function is paramount: all of them have to work and improvise together while ensuring that there are channels that gather feedback. A cross-disciplinary team, and the institutionalization of a feedback mechanism that can be quickly acted upon, are the key parameters for ensuring that the right product is in the market and that it will be extended according to the chatter of the crowds.
Having said that, this is hardly news! A lot of companies are well on their way to instilling these factors into product design and development. Companies have created organizational architectures in which culturally appropriate products are developed and maintained in dispersed local markets. In most instances, however, the approach has been to let local managers run the show, on the presumption that these “culturally appropriate” products will make good in those markets. But along the way, the piece that unravels in the course of creating the local flavor is that the product may no longer mirror the culture the corporate group wants to instill. If these two are not aptly managed and balanced, islands of conflict will be created. Thus, my contention is that a top-down value mandate ought to set the parameters inside which the hotbed of collaborative activity takes place for product design and development in various markets.
Thus, the top-down value systems necessary to bring culture into products would include:
- Open areas for employees to express their thoughts and ideas
- Diverse skill sets within product teams to enrich product development
- Encouraging internal and external speakers to expound upon the product touch points in the community.
- Empowerment and recognition systems.
- Proper formulation of monetary incentives to inspire and maintain focus.
Posted in Corporate Social Responsibility, Employee Engagement, Employee retention, Extrinsic Rewards, Innovation, Intrinsic Rewards, Leadership, Learning Organization, Learning Process, Organization Architecture, Product Design, Recognition, Rewards
LinkedIn endorsements have no value. So say many pundits! There are some interesting articles that speak of the uselessness of this product feature in LinkedIn.
I have some opinions on this matter. I started a company last year that allows people within and outside of a company to recommend professionals based on projects. We have been ushered into a world where our jobs, for the most part, constitute a series of projects undertaken over the course of a career. The recognition system around this granular element is lacking; we have recommendation and recognition systems popularized by LinkedIn, Kudos, Rypple, etc., but we have not seen much development in tools that address recognition around projects in the public domain. I foresee the possibility of LinkedIn getting into this space soon. Why? It is simple. The answer is in their “useless” Endorsement feature, which has been live since late last year. As of March 13, one billion endorsements had been given to 56 million LinkedIn members, an average of about 18 per member. What does this mean? It means that LinkedIn has just validated a potential feature that will add more flavor to the endorsements: Why have you granted these endorsements in the first place?
Thus, it stands to reason that the natural next step is to reach out to these endorsers with appropriate templates for adding more flavor to their endorsements. Doing so will prompt a small community within the 56 million participants to do so. Even if that constitutes only 10%, that is 5.6 million members contributing to this feature. Now how many products do you know that release one feature and very quickly gather close to six million active participants? In addition, this would only gain force as more and more people used the feature, and all of a sudden … the endorsements become a beachhead into a very strategic product.
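The back-of-the-envelope figures above can be sanity-checked with a quick sketch. The endorsement and membership counts are those reported in the post; the 10% participation rate is the post's own hypothetical:

```python
# Figures as cited in the post (March 2013).
endorsements = 1_000_000_000   # total endorsements given
members = 56_000_000           # LinkedIn members who received them

# Average endorsements per endorsed member.
avg_per_member = endorsements / members
print(f"average endorsements per member: {avg_per_member:.1f}")  # ~17.9

# Hypothetical 10% of members adding detail to their endorsements.
active_contributors = int(members * 0.10)
print(f"10% participation: {active_contributors:,}")  # 5,600,000
```

Even under that conservative 10% assumption, the feature would launch with a participant base of millions, which is the crux of the argument above.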
The other area that LinkedIn will probably step into is catching users young. Today its base happens to be professionals; I will not be surprised if it starts moving into the university/college space, and what more effective way to build that bridge than to position a product that recognizes individuals for the projects they have collaborated on?
LinkedIn and Facebook are two of the great companies of our time, and they are peopled with incredibly smart people. So what may seem a great failure will in fact become the enabler of a successful product that significantly increases LinkedIn’s revenue streams in the long run!
Facebook began with a simple thesis: Connect Friends. That was the sine qua non of its existence. From a simple thesis to an effective UI design, Facebook has grown over the years to become the third largest community in the world. But over the last few years it has had to resort to generating revenue to meet shareholder expectations. Today it is noon at Facebook, but a long shadow of darkness has, I posit, fallen upon perhaps one of the most influential companies in history.
The fact is that leaping from connecting friends to managing the conversations allows Facebook to create a petri dish for understanding social interactions at large scale, eased by its fine technology platform. To that end, it is moving into alternative distribution channels to reach a broader global audience and to gather deeper insights into the interaction templates of the participants. The possibilities are immense: this platform can be a collaborative beachhead into discovery, exploration, learning, education, and social and environmental awareness, and can ultimately contribute to an elevated human consciousness. But it has faltered, and perhaps the shareholders and the analysts are much to blame, on account of the entangled demands of the market: it has become one global billboard for advertisers to promote their brands. Darkness at noon is the most apt metaphor for Facebook as it is now.
Let us take a small turn to briefly look at some other very influential companies that have not been derailed as much as Facebook: Twitter, Google and LinkedIn. Each is the leader in its category, and all of them have moved toward monetization schemes built on their specific user bases. Each has weighed in significantly in its respective category to create movements that have affected, or will affect, the course of the future. We all know how Twitter has contributed to super-fast global news feeds that spontaneously generate mass coalescence around issues that make a difference; Google has been an effective tool for the average person to access information; and LinkedIn has created a collaborative environment in the professional space. Thus, all three of these companies, despite fully feeding their appetite for revenue through advertising, have not compromised their quintessential reason for being. Now, all of these companies could certainly move their artillery to encompass the trajectory of Facebook, but that would be a steep hill to climb. Furthermore, these companies have an aura associated with their categories: attempts to move out of those categories have been feeble at best and, in some instances, unsuccessful. Facebook has a phenomenal chance of putting together what it has to create a communion of knowledge and wisdom. And no company in the market is better suited to do that at this point.
One could counter that Facebook has stuck to its original vision and that what we have today is indeed what Facebook planned all along. I don’t disagree. My contention, though, is that although Facebook has created this informal and awesome platform for conversations and communities among friends, it has glossed over the immense positive fallout that could occur as a result of these interactions: the development and enhancement of knowledge, collaboration, cultural play, diversity of thought, philanthropy, the crowdsourcing of scientific and artistic breakthroughs, etc. In other words, the objective has been met for the most part. Thank you, Mark! Now Facebook needs to usher in a renaissance in the courtyard. Facebook needs to find a way out of the advertising morass that has cast darkness over all the product extensions and launches of the last two years: Facebook can force a point of inflection to quadruple its impact on the course of history and knowledge. And the revenue will follow!
Consider this. Your professional career is a series of projects. Employers look for accountability and performance, and they measure you by how you fare on your projects. Everything else, for the most part, is white noise. The projects you work on establish your skill set and, before long, your career trajectory. However, all the great stuff that you have done at work is for the most part hidden from other people in your company and your professional colleagues. You may get a recommendation on LinkedIn, which is fairly high-level, or you may receive endorsements for your skills, which is awesome. But the Endorsements on LinkedIn seem a little random, don’t they? Wouldn’t it be just awesome to recognize, or be recognized by, your colleagues for projects that you have worked on? We are sure there are projects you have worked on that involve third-party vendors, consultants, service providers, clients, etc. Well, now you have a forum to send and receive recognition, in a beautiful form factor, that you can choose to display across your networks.
Imagine an employee review. You must have spent some time thinking through all the great stuff that you have done that you want to attach to your review form. And you may have, in your haste, forgotten some of the great stuff that you have done and been recognized for informally. So how cool would it be to print or email to your manager all the projects you’ve worked on and the recognition you’ve received? How cool would it be to show all the people you have recognized for their phenomenal work? For in participating in the recognition ecosystem that our application provides, you show yourself to be an engaged and prized employee that any company would want to retain, nurture and develop.
Now imagine you are looking for a job. You have a resume. That is nice. And then the potential employer or recruiter is redirected to your professional networks, where they get a glimpse of your recommendations and skill sets. That is nice too! But seriously … wouldn’t it be better for the hiring manager or recruiter to have a deeper insight into some of the projects you have done and the recognition you have received? Wouldn’t it be nice for them to see how active you are in recognizing the great work of your colleagues and project co-workers? Then they would have a more comprehensive idea of who you are and what makes you tick.
We help you build your professional brand and convey your accomplishments. That translates into greater internal development opportunities in your company, promotions and pay increases, and it also makes you more marketable. We help you connect to high achievers and forever manage your digital portfolio of achievements, which can, at your request, exist in an open environment. JuggleStars.com is a great career management tool.
Check out www.jugglestars.com
Posted in Employee Engagement, Employee retention, Extrinsic Rewards, Innovation, Intrinsic Rewards, Leadership, Learning Organization, Learning Process, Motivation, Organization Architecture, Recognition, Rewards, Social Dynamics, Social Network, Social Systems, Talent Management
Tags: communication channel, conversation, crowdsource, employee engagement, employee recognition, extrinsic motivation, intrinsic motivation, learning organization, mass psychology, social network, social systems, talent management
About JuggleStars www.jugglestars.com
Please support JuggleStars. This is an alpha release. Use the application in your organization; the JuggleStars team will be adding more features over the next few months. Give them your feedback. They are an awesome team with great ideas. Click on www.jugglestars.com to open an account, go to Account Settings to set up your profile, and you are pretty much ready to recognize your team and your colleagues at the project level.
Posted in Corporate Social Responsibility, Employee Engagement, Extrinsic Rewards, Gamification, Innovation, Intrinsic Rewards, Leadership, Learning Organization, Recognition, Rewards, Social Causes, Social Network, Social Systems, Talent Management
Tags: communication channel, connection, conversation, employee engagement, employee recognition, extrinsic motivation, intrinsic motivation, learning organization, organization architecture, social network, social systems, talent management, value management
“The reality distortion field was a confounding mélange of a charismatic rhetorical style, an indomitable will, and an eagerness to bend any fact to fit the purpose at hand. If one line of argument failed to persuade, he would deftly switch to another. Sometimes, he would throw you off balance by suddenly adopting your position as his own, without acknowledging that he ever thought differently. “
– Andy Hertzfeld on Steve Jobs’ Reality Distortion Field.
Many of us have heard the term Reality Distortion Field. It has been attributed to Steve Jobs, who was widely known to package his messages to his constituency in such a manner that the reality of the situation was supplanted, so that people would take the bait and pursue paths that would, upon closer investigation, be dissonant from reality. But having been an avid acolyte of Jobs, I would imagine that he himself would be disturbed and unsettled by the label. Since when did the promise of a radiant future constitute a Reality Distortion Field? Since when did the ability of a person to embrace what seemingly is impossible and far-fetched, and instill confidence in the troops to achieve it, constitute a Reality Distortion Field? Since when did the ability of leadership to share in the wonders of unique and disruptive creations constitute a Reality Distortion Field? Since when did dreams of a better future, underpinned with executable actions to achieve it, constitute a Reality Distortion Field?
The usage of Reality Distortion Field reflects the dissonance between what is and what needs to be. It is a slapstick term suggesting that you are envisioning tectonic rifts between reality and possibility, and that you are leading awestruck, starry-eyed followers off a potential cliff. Some people have renamed RDF as hype or bulls*#t. They believe that RDF is extremely bad for organizations because it pushes people outside the comfort zone of physical and logical constraints and is a recipe for disaster. The argument continues that grounding an organization in reality, and communicating that reality, is essential to advancing the organization. I beg to differ.
So let me address this on two fronts: the RDF label itself; and, if we truly accept what RDF means, my position that it is the single most important attribute a strong leader ought to embrace in an organization.
The RDF label:
We all know this to be true: a rose by any other name is still a rose. We just happen to call this particular rose an RDF. It is presumed to be the ability of a person to cast possibilities in a different light, so much so that impossibilities are reduced to elements just within the grasp of reality. Now I ask you: what is wrong with that? For a leader to be able to cast a vision within the grasp of an organization is a huge proxy for the leader’s faith in the people of the organization. If a project would realistically take three months but an RDF is cast to get it done in 15 days, that is a tall order; but think of the consequences if people are “seduced” into the RDF and hence act upon it. It immediately unfolds new pathways of collaboration and unforeseen discoveries of super-efficient and effective methods. It creates trench camaraderie and distills focus into singular points to be executed against. It instills and ignites passion and engagement around the new stakes in the ground; people become keepers of one another in a consequential and significant conquest; and it brings out the creative energies and, once the goal is accomplished, the limitless possibilities of disruptive innovation in means and ends. Of course, one could also counter-argue a plethora of incidental issues in such cases: employees would burn out under the burden of unrealistic goals; employees are set up more for failing than for succeeding; it would create a disorderly orientation among groups working together to meet RDF standards; and if one were to fall short, it could be the last straw that breaks the camel’s back. So, essentially, this speaks to the ordinal magnitude of the RDF schema that leadership pushes out.
RDF and the beneficial impact to an organization:
It is the sine qua non of great leadership to be able to push organizations beyond the boundaries of plain convenience. I have, in my career, been fortunate to have been challenged and, on many occasions, forced out of my comfort zone. Having done so successfully many times has also given me the confidence to scale mountains. And that confidence is a perquisite that organizational leadership has to provide on a daily basis. After all, one of the biggest assets an employee in an organization ought to have is pride and a sense of accomplishment in their work. RDF unfolds that possibility.
We hear of disruptive innovations: innovations that leapfrog the bounds of technology inertia. How does a company enable that? Certainly not by incremental thinking. It takes a vision that lies just outside our aggregated horizon of sight. The age we live in is the result of path-breaking ideas and execution by visionaries who have aimed beyond the horizon, instilled faith among the linemen to align and execute, and made the impossible possible. We ought to thank our stars for leaders who emit an RDF and lead us out of the tenebrous existence of our diurnal professional lives.
There is absolutely no doubt that such leadership would create resistance and fierce antipathy among some. But despite some of the ill effects, the vector that drives great innovations lies in the capacity of the organization to embrace degrees of RDF to hasten and make the organizations competitive, distinctive and powerful.
Tags: boundaries, communication channel, creativity, discipline, extrinsic motivation, focus, innovation, intrinsic motivation, learning organization, meaning, organization architecture, strategy, vision