We are in the era of social computing. First I'll explain what I mean by the social computing era, and then describe how social computing might impact business decision making.
The Internet, globalization, technology convergence, the global media, cheap and highly available computing devices, mobile technologies, and a new generation of human beings that have only known the high-tech, highly connected world we live in today are combining in ever-accelerating spirals to form a world where information is free. By free I mean not only at no cost, but also freed from the hierarchies of bureaucracies that have filtered and regulated information for centuries.
Today, when information is created it is published virtually instantaneously by numerous and varied individuals and organizations, each projecting, to some degree, their own perception of it. Individuals can decide for themselves which parts of the information they believe to be fact, which parts are spin, and which parts are fiction. Far from chaos, this has created an environment of transparency about information, its sources, and their motivations; and of engagement, as people commit their minds and, consequently, their efforts to causes, ideas, principles, and other people.
The connections between ideas and the people who engage with them are happening at Internet speeds! Their scope and scale are unlimited, because social computing is not only about computing; it's about the way computing (including communications) works hand in glove with the real world. People talk to each other about ideas. Ideas evolve. The group evolves. Our views evolve. Society evolves. Youngsters are evolving faster than everyone else.
Knowledge is power! Traditional power brokers are clenching traditional information hierarchies in iron fists. But like sand, information slips through their fingers. Information is free!
The social computing era is not unlike the computing eras that have preceded it. Each era of computer evolution builds on the previous era. Bell's Law predicts a new computing platform about every decade.[1]
In his definition of Web 2.0, Tim O'Reilly, the man who coined the term "Web 2.0," states, "The Web is the Platform". I'll take the liberty of further qualifying that definition: the Web and every device connected to it is the platform. This includes PCs and convergent devices like Apple's iPhone, Nokia's N95, BlackBerry's Bold, Palm's Treo, etc., that connect us to the Internet and to IP (Internet Protocol) based services, allowing us to connect to information and people anywhere, anytime.
So, what is social computing? One way to understand the social computing era is to put a boundary on personal computing. What's your experience of a PC when it is NOT connected? Once the PC is disconnected from the Internet you're left with a personal computer: a wonderful device for personal productivity, great for creating documents, spreadsheets, presentations, and databases, and even for personal entertainment. But think of the limitations you face when you can't connect. We connect to share, communicate, interact, and collaborate for business and personal reasons. This is the essence of social computing. It's our use of technology to extend and scale our very human and insatiable desire to communicate with other people, for all of the reasons we have communicated for millions of years.

We want to share our thoughts and feelings. We want to contribute to the group. We want to be recognized. We want to be understood. We want to be wanted and needed. We want to belong. We want to make a difference. We want to feel significant. These are very strong motivations that won't be constrained or squashed by any organization, regime, government, or society, because they are the motivations that make us human. It is our desire to express ourselves, and the ingenuity we apply to finding a way to be heard, that make me think the trajectory of social computing is inevitable. We want to be free.
Social Computing and Business
Social computing transcends the personal computer, extends to converged devices and is all about personal and business social interaction.
Social computing has changed the human-to-computer interface into a human-to-computer-to-x-to-computer-to-human interface. It takes us from the WHAT information provided by traditional (pre-social computing) applications – like what is the order number – to the WHY information – like why did that customer change their order. "WHAT" information tends to be formal. "WHY" information tends to be informal. It's the bits of information like ratings, reviews, comments, likes, dislikes, and personal perspectives and opinions, captured in the form of user-generated content (UGC) such as wikis and blogs, that form a new layer of information: metadata upon which new standards of credibility, authenticity, authority, and trust are being created.
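To make the WHAT/WHY distinction concrete, here's a minimal sketch in Python (all field names are invented for illustration, not drawn from any particular product) of a formal order record with an informal UGC layer attached:

```python
# A hedged sketch: the "WHAT" is the formal transaction record;
# the "WHY" is the informal, user-generated metadata layered on top.
# All field names here are hypothetical.

order = {                      # WHAT: formal, structured, system-of-record data
    "order_number": "SO-10482",
    "customer_id": "C-2201",
    "status": "amended",
}

ugc_layer = [                  # WHY: informal, user-generated context
    {"author": "account_mgr", "type": "comment",
     "text": "Customer cut the order in half; budget freeze until Q3."},
    {"author": "customer", "type": "rating", "value": 2,
     "text": "Delivery delays on the last shipment."},
]

# The WHY layer explains the WHAT record, and is where credibility,
# authority and trust signals (ratings, authorship) accumulate.
order["why"] = ugc_layer
```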
Computing platforms shape or determine how humans use computers, and computers have accelerated human social evolution. Continuous exponential improvement of computer processing power, scale, and usability; coupled with continually decreasing price points (Moore's Law[2] and Metcalfe's Law[3]) for technology; and ubiquitous uptake and use of computers have forever changed the shape of human social interaction. Businesses are not isolated from these trends.
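The compounding behind these laws is easy to underestimate. Here is a quick back-of-envelope sketch (illustrative arithmetic only, not a measurement) of what doubling every 18 months does over twenty years:

```python
# Moore's Law, back-of-envelope: density doubles roughly every 18 months.
# Numbers are illustrative, not measurements.
months = 20 * 12                      # a twenty-year horizon
doublings = months / 18               # ~13.3 doublings
growth = 2 ** doublings
print(f"~{growth:,.0f}x more transistors in 20 years")  # ~10,321x
```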
Business information processing and information management capability has advanced with each era of computing, and each era has been underpinned by technology platforms (hardware and software) that determine how and why people interact and process information with computers. In the 1960's computing was dominated by mainframes, mainframe operating systems, and mainframe programming languages. The good ol' green-screen data terminals were the human interface to proprietary mainframes. Mainframe computers centrally processed information, giving businesses more accurate and more efficient accounting-related information; people input the data, and businesses gained the ability to scale by processing transactions more accurately and efficiently.
In the late 1970's and early 1980's the mainframe was followed by the mini-computer, which was essentially a smaller, less expensive mainframe, expanding the use of computers to a larger market of medium and small organizations. Mini-computers were very similar to mainframes in hardware and software architecture and in their proprietary nature.
Minis were followed by personal computers in the early 1980's, which opened up computer use to every business and eventually to every home. Personal computers are architecturally quite different from mainframes and minis: the PC was designed for personal use, whereas the mainframe and mini-computer were designed for multi-user, business or organizational use. Shortly thereafter, PCs were networked to one another: in offices through local area networks (LANs), across cities through metropolitan area networks (MANs), and across countries and the globe through wide area networks (WANs).
In the 1990's computers and private computer networks (LANs, MANs, and WANs) were connected to the Internet and the World Wide Web (Web 1.0). Suddenly, publishing globally discoverable information became easy. Online commerce added transactions and some business processes. And at the end of the decade the dot-com boom (and bust) created new business and communications paradigms. Note that none of the computing platforms from previous eras has disappeared; each has remained a viable platform for specific uses.
The first half of the first decade of the new millennium saw Web 1.0 consolidation and the growth of successful web companies like Google, Amazon, eBay, Yahoo!, and most recently, MySpace and Facebook. These companies have used eight design patterns that Tim O'Reilly has defined as Web 2.0 (http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html?page=5). Four of the patterns are social.
1. The Long Tail: Small sites make up the bulk of the internet's content; narrow niches make up the bulk of the internet's possible applications. Therefore: Leverage customer self-service and algorithmic data management to reach out to the entire web, to the edges and not just the center, to the long tail and not just the head. (A numerical sketch of this pattern follows the list.)
3. Users Add Value: The key to competitive advantage in internet applications is the extent to which users add their own data to that which you provide. Therefore: Don't restrict your "architecture of participation" to software development. Involve your users both implicitly and explicitly in adding value to your application.
4. Network Effects by Default: Only a small percentage of users will go to the trouble of adding value to your application. Therefore: Set inclusive defaults for aggregating user data as a side-effect of their use of the application.
7. Cooperate, Don't Control: Web 2.0 applications are built of a network of cooperating data services. Therefore: Offer web services interfaces and content syndication, and re-use the data services of others. Support lightweight programming models that allow for loosely-coupled systems.
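Here's what the Long Tail pattern looks like numerically – a small sketch on synthetic, Zipf-style popularity data (the figures are illustrative, not measurements):

```python
# Long Tail sketch on synthetic data: item popularity follows a
# Zipf-like curve, popularity ~ 1/rank. The few "head" items are each
# huge, but the many "tail" items add up to the larger share.
N = 100_000                               # number of items (sites, niches...)
popularity = [1.0 / rank for rank in range(1, N + 1)]

head = sum(popularity[:100])              # top 100 items
tail = sum(popularity[100:])              # everything else
print(f"head share: {head / (head + tail):.0%}")   # ~43%
print(f"tail share: {tail / (head + tail):.0%}")   # ~57%
```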
So not only is computer use social; Web 2.0 systems incorporate social design patterns from inception. This is a significant departure from the way systems were designed on the platforms of previous computing eras. It also highlights the futility of trying to re-engineer systems designed in those eras into social computing systems.
The really cool thing about this is that humans are social beings, and as we've seen from MySpace and Facebook, social computing platforms give us the ability to scale our social behavior. Now, instead of the 5 friends I used to stay in touch with, I can stay in touch with 50 or 500 or even 5,000 friends! And they can be anywhere on the planet, as long as they have access to the Internet (preferably broadband).
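That scaling claim can be made concrete with a bit of Metcalfe-style arithmetic: the number of possible pairwise connections among n people is n(n-1)/2, so a network's potential grows roughly with the square of its membership. A tiny sketch:

```python
# Possible pairwise connections among n people: n * (n - 1) / 2
def connections(n: int) -> int:
    return n * (n - 1) // 2

for n in (5, 50, 500, 5_000):
    print(f"{n:>5} friends -> {connections(n):>10,} possible connections")
# 5 -> 10, 50 -> 1,225, 500 -> 124,750, 5,000 -> 12,497,500
```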
The impact of social computing in business or the enterprise has been widely discussed.
In the Spring of 2006, Harvard associate professor Andrew McAfee coined the term "Enterprise 2.0" in the MIT Sloan Management Review article "Enterprise 2.0: The Dawn of Emergent Collaboration". The article states, "These new digital platforms for generating, sharing and refining information are already popular on the Internet, where they're collectively labeled 'Web 2.0' technologies. I use the term 'Enterprise 2.0' to focus only on those platforms that companies can buy or build in order to make visible the practices and outputs of their knowledge workers."
Gartner defined enterprise social software at its Integration & Web Services Summit in 2007: "Enterprise social software provides an open and freeform environment that 1) stimulates large-scale participation through informal interactions, and 2) aggregates these interactions into an emergent structure that reflects the collective attitudes, dispositions and knowledge of participants."
In his book, "The Wisdom of Crowds", James Surowiecki states, "under the right circumstances, groups are remarkably intelligent, and are often smarter than the smartest people in them. Groups do not need to be dominated by exceptionally intelligent people in order to be smart. Even if most of the people within the group are not especially well informed or rational, it can still reach a collectively wise decision." He goes on to identify "the conditions necessary for the crowd to be wise: diversity, independence and a particular kind of decentralization." He also focuses his book on three kinds of problems:

1. Cognition – problems that have, or will have, definitive solutions.
2. Coordination – problems that require members of a group to coordinate their behavior with each other, knowing that everyone else is trying to do the same.
3. Cooperation – problems that involve the challenge of getting self-interested, distrustful people to work together, even when narrow self-interest would seem to dictate that no individual should take part.

A quick simulation of the cognition case follows.
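The sketch below uses an idealized model – independent, unbiased guesses with random error, assumptions real crowds rarely satisfy – which is exactly why Surowiecki's conditions matter:

```python
import random
import statistics

# Idealized wisdom-of-crowds simulation: each member guesses a true
# value with independent, unbiased noise. Real crowds violate these
# assumptions, which is exactly Surowiecki's point about diversity
# and independence.
random.seed(42)
TRUE_VALUE = 1_198            # e.g. the weight of Galton's famous ox, in lbs
guesses = [random.gauss(TRUE_VALUE, 150) for _ in range(800)]

crowd_estimate = statistics.mean(guesses)
typical_individual_error = statistics.median(abs(g - TRUE_VALUE) for g in guesses)

print(f"crowd error:      {abs(crowd_estimate - TRUE_VALUE):6.1f}")
print(f"individual error: {typical_individual_error:6.1f}")
# The crowd's error shrinks roughly with 1/sqrt(n); most individuals do far worse.
```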
Businesses face these types of problems every day, and decisions are typically made unilaterally by an executive or a manager, by committee, or with some degree of consultation – but almost never by polling a crowd.
Interestingly, Surowiecki points out that the worst way to make a decision is the way most organizations make decisions today: with teams bred in conditions antithetical to the ones necessary for the crowd to be wise. Many decision-making teams are NOT DIVERSE – often they're homogeneous; NOT INDEPENDENT – often they're completely interdependent; and NOT DECENTRALIZED – often they're centralized. In the recent Wall Street meltdown, as in other business catastrophes, we can see a small group of people making decisions from a homogeneous perspective. In the absence of an environment that actively pursues diversity, independence, and decentralization, presenting oppositional views to the group, committee, or team will most likely be disastrous for one's career. The lack of diversity, independence, and decentralization – particularly the first two – will cause the person who can bring variety to the group, the person who is different, to be rejected and ostracized.
I find it very interesting that democratic society has faith in crowd decisions. Every democratic electoral process is based on crowd wisdom. We trust that a majority vote will take a nation in the "right" direction, and yet we run companies as if they were monarchies or dictatorships, disempowering employees and reducing them to cogs in a machine rather than thinking human beings. Regrettably, many organizations waste the talent and capabilities that could stimulate innovation and organizational transformation. It's no wonder many workers just show up to punch the clock and collect a check.
Enterprise Web 2.0 technologies can begin to unleash and harness the wisdom of the crowd within large organizations and large communities. These technologies make it possible to apply decision science to organizational decision making. Decision scientists have the opportunity to create decision-making frameworks that are more transparent, inclusive, and simple, and to track the quality of decisions based on outcomes. Standardizing decision-making processes in organizations leads to improved governance and reduces the risk that decisions are made without maximizing organizational wisdom. A decision-making framework that satisfies the conditions Surowiecki describes would be a very interesting start. By combining Enterprise 2.0 technologies with a methodology for making certain types of decisions, businesses should improve their decision-making ability. Imagine the impact an open decision-making platform would have on governance, competitiveness, innovation, customer satisfaction, employee satisfaction, brand, efficiency, effectiveness, productivity, research and development, marketing, etc. There isn't an area of business or government where improved decision making would not be beneficial.
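What might such a framework look like? A deliberately minimal sketch – every name here is hypothetical, and a real platform would need identity, access control, and far richer workflow – of gathering independent inputs and keeping the audit trail that lets you score decisions against outcomes:

```python
from dataclasses import dataclass, field
from statistics import mean

# Hypothetical sketch of a transparent decision record: gather
# independent estimates before discussion (protecting independence),
# aggregate them, and keep the trail so decision quality can be
# scored against outcomes later.
@dataclass
class Decision:
    question: str
    estimates: dict[str, float] = field(default_factory=dict)  # person -> estimate
    outcome: float | None = None

    def submit(self, person: str, estimate: float) -> None:
        self.estimates.setdefault(person, estimate)   # one independent input each

    def aggregate(self) -> float:
        return mean(self.estimates.values())

    def error(self) -> float:
        return abs(self.aggregate() - self.outcome)   # scored once outcome is known

d = Decision("Units we will sell in Q3?")
for person, guess in [("eng", 900), ("sales", 1400), ("support", 1100)]:
    d.submit(person, guess)
print(d.aggregate())        # 1133.33... -- the crowd's estimate
d.outcome = 1200
print(round(d.error(), 1))  # 66.7 -- feeds back into quality tracking
```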
Footnotes:
[1] Established market class computers aka platforms are introduced and continue to evolve at roughly a constant price (subject to learning curve cost reduction) with increasing functionality (or performance) based on Moore's Law that gives more transistors per chip, more bits per unit area, or increased functionality per system. Roughly every decade, technology advances in semiconductors, storage, networks, and interfaces enable a new, lower cost computer class aka platform to form to serve a new need that is enabled by smaller devices e.g. less transistors per chip, less expensive storage, displays, i/o, network, and unique interface to people or some other information processing sink or source. Each new lower priced class is then established and maintained as a quasi independent industry and market. Such a class is likely to evolve to substitute for an existing class or classes as described above with computer clusters. – Source, Wikipedia.com.
[2] (môrz lâ) (n.) The observation made in 1965 by Gordon Moore, co-founder of Intel, that the number of transistors per square inch on integrated circuits had doubled every year since the integrated circuit was invented. Moore predicted that this trend would continue for the foreseeable future. In subsequent years, the pace slowed down a bit, but data density has doubled approximately every 18 months, and this is the current definition of Moore's Law, which Moore himself has blessed. Most experts, including Moore himself, expect Moore's Law to hold for at least another two decades. – Source, Webopedia.com
[3] Metcalfe's Law is expressed in two general ways:
1) The number of possible cross-connections in a network grows as the square of the number of computers in the network increases.
2) The community value of a network grows as the square of the number of its users increases.
The original statement from Robert M. Metcalfe, inventor of Ethernet, was apparently (according to one source): "The power of the network increases exponentially by the number of computers connected to it. Therefore, every computer added to the network both uses it as a resource while adding resources in a spiral of increasing value and choice."
Metcalfe's Law is often cited as an explanation for the rapid growth of the Internet (or perhaps more especially for the World Wide Web on the Internet). Together with Moore's Law about the rate at which computer power is accelerating, Metcalfe's Law can be used to explain the rising wave of information technology that we are riding into the 21st century. – Source, Searchnetworking.com