Thursday, February 19, 2009

Bush Fires and Web 2.0

With fires still blazing around Melbourne I have to comment that it's obvious Web 2.0 social networks can save lives in Australia's bush fires and other disasters by delivering timely information to residents through multiple channels, including the Web (PC and mobile) and SMS. I was driving home today and heard an "urgent alert" message: "The fire in the Mount Riddell sector east of Healesville has spotted over containment lines and is actively burning south of Narbethong in the Dom Dom area. The communities of the Narbethong area may be directly impacted upon by this fire." The message went on to tell people to enact their fire plans now. But what happens if you're not listening to the radio at that moment, not on your PC checking the CFA website, and not close enough to the fire brigade to hear a siren? How do you know that there is imminent danger? Timely information is critical. The vast majority of Australians have mobile phones and 3G devices like the iPhone and carry them around all the time. A simple SMS notification can alert us to act right now. The ability to message back is the key difference with social networking. The typical pattern is that I can reply or comment on a message and, just as importantly, my reply can be seen by others who want to see it. This allows everyone to be involved and to provide vital information when the information flow is absolutely critical.
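To make the idea concrete, here's a rough sketch of the kind of two-way alert service I'm imagining. The subscriber register, the send_sms stub and the phone numbers are all invented placeholders – this isn't a real CFA or carrier API, just the shape of the idea:

```python
import time

# Hypothetical subscriber register: phone numbers grouped by locality.
SUBSCRIBERS = {
    "Narbethong": ["+61400000001", "+61400000002"],
    "Healesville": ["+61400000003"],
}

# Replies are kept and shared, so everyone watching an area
# can see what residents on the ground are reporting.
REPLY_LOG = []

def send_sms(number, message):
    """Placeholder for a real SMS gateway call (carrier API, etc.)."""
    print(f"SMS to {number}: {message}")

def broadcast_alert(area, message):
    """Push an urgent alert to every registered resident of an area."""
    for number in SUBSCRIBERS.get(area, []):
        send_sms(number, f"URGENT ALERT ({area}): {message}")

def receive_reply(number, text):
    """Record an inbound reply and make it visible to the whole group."""
    entry = {"from": number, "text": text, "at": time.strftime("%H:%M")}
    REPLY_LOG.append(entry)
    return entry

broadcast_alert("Narbethong", "Fire has spotted over containment lines. Enact your fire plan now.")
receive_reply("+61400000001", "Smoke visible to the north, leaving via Maroondah Hwy.")
print(REPLY_LOG)
```

The point is the reply log: the alert goes out over a channel people actually carry, and what comes back is visible to everyone watching that area, not just to the operator.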

So while it's nice to know what someone is doing right now on Facebook or Twitter, a better use of the technology would be to inform residents of impending danger and allow people to message back about their status and situation.

This is more than just a thought; it can and should be the reality.

Tuesday, October 21, 2008

Social Computing's Potential Impact on Business

We are in the era of social computing. First, I'll explain what I mean by the social computing era and then describe how social computing might impact business decision making.
The Internet, globalization, technology convergence, the global media, cheap and highly available computing devices, mobile technologies, and a new generation of human beings who have only known the high-tech, highly connected world we live in today are coalescing, in ever-accelerating spirals, to form a world where information is free. By free I not only mean at no cost, but also freed from the hierarchies of bureaucracies that have filtered and regulated information for centuries.

Today, when information is created it is almost instantaneously published by numerous and varied individuals and organizations, each projecting, to some degree, their perception of the information. Individuals can make their own decisions as to which parts of the information they believe to be fact, which parts are spin and which parts are fiction. Far from chaos, this has created an environment of transparency – of information, its sources and their motivations – and of engagement, as people commit their minds and, consequently, their efforts to causes, ideas, principles, and other people.

The connections between ideas and the people who engage with them are happening at Internet speed! Their scope and scale are unlimited because social computing is not only about computing; it's about the way computing (including communications) works hand in glove with the real world. People talk to each other about ideas. Ideas evolve. The group evolves. Our views evolve. Society evolves. Youngsters are evolving faster than everyone else.

Knowledge is power! Traditional power brokers are tightly clenching traditional information hierarchies with iron gloves. But, like sand through clenched fingers, information is drawn out by the force of gravity. Information is free!

The social computing era is not unlike the computing eras that have preceded it. Each era of computer evolution builds on the previous era. Bell’s Law predicts a new computing platform about every decade.[1]

In his definition of Web 2.0, Tim O'Reilly, the man who coined the term "Web 2.0", states, "The Web is the Platform". I'll take the liberty of qualifying that definition further: the Web and every device connected to it is the platform. This includes PCs and convergent devices like Apple's iPhone, Nokia's N95, BlackBerry's Bold, Palm's Treo, etc. that connect us to the Internet and IP (Internet Protocol) based services, allowing us to connect to information and people anywhere, anytime.

So, what is social computing? One way to understand the social computing era simply is to put a boundary on personal computing. What's your experience of PCs when they are NOT connected? Once the PC is disconnected from the Internet you're left with a personal computer, which is a wonderful device for personal productivity – great for creating documents, spreadsheets, presentations, databases, and even for personal entertainment. But think of the limitations you face when you can't connect. We connect now to share, communicate, interact and collaborate for business and personal reasons. This is the essence of social computing. It's our use of technology to extend and scale our very human and insatiable desire to communicate with other people for all of the reasons that we have communicated for millions of years. We want to share our thoughts and feelings. We want to contribute to the group. We want to be recognized. We want to be understood. We want to be wanted and needed. We want to belong. We want to make a difference. We want to feel significant. These are very strong motivations that won't be constrained or squashed by any organization, regime, government, or society, because they are the motivations that make us human. It is our desire to express ourselves, and the ingenuity we apply to finding a way to be heard, that make me think the trajectory of social computing is inevitable. We want to be free.

Social Computing and Business

Social computing transcends the personal computer, extends to converged devices and is all about personal and business social interaction.

Social computing has changed the human-to-computer interface into a human-to-computer-to-x-to-computer-to-human interface. It takes us from the WHAT information provided by traditional (pre-social computing) applications – like what the order number is – to the WHY information – like why that customer changed their order. "WHAT" information tends to be formal. "WHY" information tends to be informal. It's the bits of information like ratings, reviews, comments, likes, dislikes and personal perspectives, captured in the form of user generated content (UGC) such as wikis and blogs, that form a new layer of information – metadata – upon which new standards of credibility, authenticity, authority and trust are being created.
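As a rough illustration of the WHAT/WHY split, here's a sketch (the field names are my own, purely illustrative) of a formal order record wrapped in an informal UGC layer:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Order:
    # The "WHAT": formal, structured transaction data.
    order_number: str
    customer: str
    quantity: int

@dataclass
class Annotation:
    # The "WHY": informal, user-generated context.
    author: str
    kind: str       # e.g. "comment", "rating", "review"
    content: str

@dataclass
class SocialOrder:
    order: Order
    annotations: List[Annotation] = field(default_factory=list)

    def why(self):
        """Return the informal layer that explains the formal record."""
        return [a.content for a in self.annotations]

record = SocialOrder(Order("PO-1234", "Acme Pty Ltd", 500))
record.annotations.append(
    Annotation("j.smith", "comment", "Customer halved the order after the price rise.")
)
print(record.order.order_number, "->", record.why())
```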

Computing platforms shape how humans use computers, and computers have accelerated human social evolution. Continuous exponential improvement in computer processing power, scale and usability, coupled with continually decreasing price points for technology (Moore's Law[2] and Metcalfe's Law[3]) and the ubiquitous uptake and use of computers, have forever changed the shape of human social interaction. Businesses are not isolated from these trends.

Business information processing and information management capability has advanced with each era of computing, and each era has been underpinned by technology platforms (hardware and software) that determine how and why people interact and process information with computers. In the 1960s computing was dominated by mainframes, mainframe operating systems and mainframe programming languages. The good ol' green screen data terminals were the human interface to proprietary mainframes. Mainframes centrally processed information, and people keyed data into them, giving businesses the ability to scale by processing transactions – especially accounting transactions – more accurately and efficiently.

In the late 1970s and early 1980s the mainframe was followed by the mini-computer, which was essentially a smaller, less expensive mainframe – very similar in hardware architecture, software architecture and proprietary nature – and which expanded the use of computers to a larger market of medium and small organizations.
Minis were followed by personal computers in the early 1980s, which opened up computer use to every business and eventually to every home. Personal computers are architecturally very different from mainframes and minis: the PC was designed around personal use, whereas mainframes and minicomputers were designed around multi-user, business or organizational use. Shortly thereafter, PCs were networked to one another in offices through local area networks (LANs), across cities through metropolitan area networks (MANs) and across countries and the globe through wide area networks (WANs).
In the 1990s computers and private computer networks (LANs, MANs and WANs) were connected to the Internet and the World Wide Web (Web 1.0). Suddenly, publishing globally discoverable information became easy. Online commerce added transactions and some business processes. And at the end of the decade the dot-com boom (and bust) created new businesses and communications paradigms. Note that none of the computing platforms from previous eras has disappeared; each has continued to be a viable platform for specific uses.

The first half of the first decade of the new millennium saw Web 1.0 consolidation and the growth of successful web companies like Google, Amazon, eBay, Yahoo! and, most recently, MySpace and Facebook. These companies have used the eight design patterns that Tim O'Reilly defined as Web 2.0 (http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html?page=5). Four of the patterns are social:

1. The Long Tail: Small sites make up the bulk of the internet's content; narrow niches make up the bulk of the internet's possible applications. Therefore: Leverage customer self-service and algorithmic data management to reach out to the entire web, to the edges and not just the center, to the long tail and not just the head.

3. Users Add Value: The key to competitive advantage in internet applications is the extent to which users add their own data to that which you provide. Therefore: Don't restrict your "architecture of participation" to software development. Involve your users both implicitly and explicitly in adding value to your application.

4. Network Effects by Default: Only a small percentage of users will go to the trouble of adding value to your application. Therefore: Set inclusive defaults for aggregating user data as a side-effect of their use of the application.

7. Cooperate, Don't Control: Web 2.0 applications are built of a network of cooperating data services. Therefore: Offer web services interfaces and content syndication, and re-use the data services of others. Support lightweight programming models that allow for loosely-coupled systems.
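To show what pattern 7 looks like in practice, here's a minimal sketch of re-using someone else's data service over a lightweight interface. The feed URL is a made-up placeholder, not a real service:

```python
import json
from urllib.request import urlopen

# Hypothetical, loosely coupled data service exposed by another party.
FEED_URL = "https://api.example.com/bookmarks?tag=web2.0&format=json"

def fetch_feed(url):
    """Consume a syndicated JSON feed over plain HTTP -- no SDK, no contract
    beyond the URL and the document format."""
    with urlopen(url) as response:
        return json.load(response)

def titles(feed):
    """Re-use the other service's data inside our own application."""
    return [item.get("title", "") for item in feed.get("items", [])]

if __name__ == "__main__":
    # Would print the feed's titles when pointed at a real endpoint.
    print(titles(fetch_feed(FEED_URL)))
```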

So not only is computer use social, Web 2.0 systems incorporate social design patterns from inception. This is a significant departure from designing systems based on the platforms of the previous computing eras. It also highlights the futility of trying to re-engineer systems designed in those eras into social computing systems.
The really cool thing about this is that humans are social beings and, as we've seen from MySpace and Facebook, social computing platforms give us the ability to scale our social behavior. Now, instead of the 5 friends that I used to stay in touch with, I can stay in touch with 50 or 500 or even 5,000 friends! And they can be anywhere on the planet, as long as they have access to the Internet (preferably broadband).

The impact of social computing in business or the enterprise has been widely discussed.
In the spring of 2006, Harvard associate professor Andrew McAfee coined the term "Enterprise 2.0" in the MIT Sloan Management Review article, "Enterprise 2.0: The Dawn of Emergent Collaboration". The article states, "These new digital platforms for generating, sharing and refining information are already popular on the Internet, where they're collectively labeled 'Web 2.0' technologies. I use the term 'Enterprise 2.0' to focus only on those platforms that companies can buy or build in order to make visible the practices and outputs of their knowledge workers."

Gartner defined Enterprise Social Software at its Integration & Web Services Summit in 2007, “Enterprise social software provides an open and freeform environment that 1) stimulates large-scale participation through informal interactions, and 2) aggregates these interactions into an emergent structure that reflects the collective attitudes, dispositions and knowledge of participants.”

In his book "The Wisdom of Crowds", James Surowiecki states, "under the right circumstances, groups are remarkably intelligent, and are often smarter than the smartest people in them. Groups do not need to be dominated by exceptionally intelligent people in order to be smart. Even if most of the people within the group are not especially well informed or rational, it can still reach a collectively wise decision." He goes on to identify "the conditions necessary for the crowd to be wise: diversity, independence and a particular kind of decentralization." He also focuses the book on three kinds of problems: cognition – problems that have, or will have, definitive solutions; coordination – problems that require members of a group to coordinate their behavior with each other, knowing that everyone else is trying to do the same; and cooperation – problems that involve the challenge of getting self-interested, distrustful people to work together, even when narrow self-interest would seem to dictate that no individual should take part.
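A toy simulation (my own illustration, not from the book) makes the cognition case concrete: many independent, noisy guesses, once averaged, usually land closer to the truth than most of the individual guessers do:

```python
import random

random.seed(42)
TRUE_VALUE = 850  # e.g. jelly beans in a jar

# A diverse, independent crowd: each guess is noisy in its own way.
crowd = [TRUE_VALUE + random.gauss(0, 200) for _ in range(500)]

crowd_estimate = sum(crowd) / len(crowd)
crowd_error = abs(crowd_estimate - TRUE_VALUE)

# How many individuals were more accurate than the collective estimate?
better_individuals = sum(1 for g in crowd if abs(g - TRUE_VALUE) < crowd_error)

print(f"Crowd estimate: {crowd_estimate:.0f} (error {crowd_error:.0f})")
print(f"Individuals more accurate than the crowd: {better_individuals} of {len(crowd)}")
```

Run it and typically only a small fraction of the 500 guessers beat the averaged estimate – and that's with no coordination at all.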

Businesses face these types of problems every day, and decisions are typically made unilaterally by an executive or a manager, by committee, or with some degree of consultation – but almost never by polling a crowd.

Interestingly, Surowiecki points out that the worst way to make a decision is the way most organizations make decisions today: with teams that are bred in conditions antithetical to the ones necessary for the crowd to be wise. Many decision-making teams are NOT DIVERSE – often they're homogenous; NOT INDEPENDENT – often they're completely interdependent; and NOT DECENTRALIZED – often they're centralized. In the recent Wall Street meltdown, as with other business catastrophes, we can see a small group of people making decisions from a homogeneous perspective. In the absence of an environment that actively pursues diversity, independence and decentralization, presenting oppositional views to the group, committee or team will most likely be disastrous for one's career. The lack of diversity, independence and decentralization – particularly the first two – will cause the person who could bring variety to the group, the person who is different, to be rejected and ostracized.

I find it very interesting that democratic society has faith in crowd decisions. Every democratic electoral process is based on crowd wisdom. We trust that a majority vote will take a nation in the "right" direction, and yet we run companies as if they were monarchies or dictatorships, disempowering employees and reducing them to cogs in a machine rather than thinking human beings. Regretfully, many organizations waste the talent and capabilities that could stimulate innovation and organizational transformation. It's no wonder many workers just show up to punch the clock and collect a check.

Enterprise Web 2.0 technologies can begin to unleash and harness the wisdom of the crowd within large organizations and large communities. These technologies make it possible to apply decision science to organizational decision making. Decision scientists have the opportunity to create decision-making frameworks that are more transparent, inclusive and simple, and to track the quality of decisions based on outcomes. Standardizing decision-making processes in organizations improves governance and reduces the risk that decisions are made without maximizing organizational wisdom. A decision-making framework that satisfies the conditions described by Surowiecki would be a very interesting start. By combining Enterprise 2.0 technologies with a methodology for making certain types of decisions, businesses should improve their decision-making ability. Imagine the impact an open decision-making platform would have on governance, competitiveness, innovation, customer satisfaction, employee satisfaction, brand, efficiency, effectiveness, productivity, research and development, marketing, etc. There isn't an area of business or government where improved decision making would not be beneficial.
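As a sketch of what a transparent, trackable decision framework might look like (the structure and field names are my own, not Surowiecki's), imagine recording each decision, the independent inputs behind it and the eventual outcome, so that decision quality can be audited over time:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DecisionRecord:
    question: str
    options: List[str]
    votes: dict = field(default_factory=dict)   # contributor -> chosen option
    outcome: Optional[str] = None               # filled in later for auditing

    def cast(self, contributor, option):
        """Each contributor votes independently; all votes stay visible."""
        self.votes[contributor] = option

    def decide(self):
        """Aggregate the crowd's input into a single recommendation."""
        tally = {opt: list(self.votes.values()).count(opt) for opt in self.options}
        return max(tally, key=tally.get)

    def record_outcome(self, result):
        """Close the loop so future decisions can learn from past ones."""
        self.outcome = result

d = DecisionRecord("Which market do we enter next?", ["NZ", "Singapore", "UK"])
for person, choice in [("ops", "NZ"), ("sales", "Singapore"), ("eng", "Singapore")]:
    d.cast(person, choice)
print(d.decide())       # -> "Singapore"
d.record_outcome("Entered Singapore; revenue target met in year one")
```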

Footnotes:
[1] Established market class computers aka platforms are introduced and continue to evolve at roughly a constant price (subject to learning curve cost reduction) with increasing functionality (or performance) based on Moore's Law that gives more transistors per chip, more bits per unit area, or increased functionality per system. Roughly every decade, technology advances in semiconductors, storage, networks, and interfaces enable a new, lower cost computer class aka platform to form to serve a new need that is enabled by smaller devices e.g. less transistors per chip, less expensive storage, displays, i/o, network, and unique interface to people or some other information processing sink or source. Each new lower priced class is then established and maintained as a quasi independent industry and market. Such a class is likely to evolve to substitute for an existing class or classes as described above with computer clusters. – Source, Wikipedia.com.

[2] (môrz lâ) (n.) The observation made in 1965 by Gordon Moore, co-founder of Intel, that the number of transistors per square inch on integrated circuits had doubled every year since the integrated circuit was invented. Moore predicted that this trend would continue for the foreseeable future. In subsequent years, the pace slowed down a bit, but data density has doubled approximately every 18 months, and this is the current definition of Moore's Law, which Moore himself has blessed. Most experts, including Moore himself, expect Moore's Law to hold for at least another two decades. – Source, Webopedia.com
[3] Metcalfe's Law is expressed in two general ways:
1) The number of possible cross-connections in a network grows as the square of the number of computers in the network increases.
2) The community value of a network grows as the square of the number of its users increase.
The original statement from Robert M. Metcalfe, inventor of Ethernet, was apparently (according to one source): "The power of the network increases exponentially by the number of computers connected to it. Therefore, every computer added to the network both uses it as a resource while adding resources in a spiral of increasing value and choice."
Metcalfe's Law is often cited as an explanation for the rapid growth of the Internet (or perhaps more especially for the World Wide Web on the Internet). Together, with Moore's Law about the rate at which computer power is accelerating, Metcalfe's Law can be used to explain the rising wave of information technology that we are riding into the 21st century. – Source, Searchnetworking.com
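A quick back-of-the-envelope calculation shows the quadratic growth footnote [3] describes: with n computers there are n(n-1)/2 possible cross-connections, so doubling the network roughly quadruples them.

```python
def cross_connections(n):
    """Possible pairwise connections in a network of n nodes: n(n-1)/2."""
    return n * (n - 1) // 2

for n in (10, 20, 40, 80):
    print(f"{n:>3} nodes -> {cross_connections(n):>5} possible connections")
# 10 -> 45, 20 -> 190, 40 -> 780, 80 -> 3160: roughly 4x for every doubling
```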

Wednesday, August 20, 2008

Alternatives to paying increased software maintenance fees

Larry Dignan is right to be irate in his blog, Why there's a software support and maintenance revolution underway.

Consider alternatives: Open Source, SaaS and Innovation from small businesses
There are options like open source software (OSS), whose governance is well documented and widely understood. Businesses can adopt OSS and run it in production with no upfront software cost. There is a large pool of Java developers and an industry that supports open-standard frameworks and architectures. There are also standard software development methodologies like RUP and Agile which can be trained for, tested and audited. The OSS industry is only becoming more mature and reliable.

The large companies that are spending millions of dollars on these technologies need to improve the ROI on these software investments right now! I have worked in Australia's largest businesses in strategy and architecture roles providing planning and governance, and repeatedly I've seen what could arguably be called over-capitalizing in technology with large software vendors and top-tier consulting companies.

To a significant degree, I agree with Nicholas Carr's views in his article "The End of Corporate Computing". As a technologist I love what technology is doing for the world right now (particularly the openness and new levels of social evolution through social networks and Web 2.0). But this is happening in the so-called consumer market, not the so-called enterprise market. Obviously businesses have different, and more complex, computing needs than consumers, but aren't we all also consumers? We merely play different roles in business and as consumers. So there are similar requirements; there have to be, because we're the same people inside our businesses or vocations as we are at the local supermarket buying groceries on the way home from work. As people, we want a consistent way to access information inside and outside of our businesses. (BTW, why isn't corporate information as accessible as information on the Internet?)

Service providers like Google, Amazon and Yahoo! provide FREE services of the highest quality and reliability. There are also OSS players like SugarCRM, Lucene, Terracotta, Aegeon, Ubuntu, MySQL, JBoss, etc. that have terrific FREE products. Thanks to Google and the Internet, all of these companies are only two clicks away. Imagine if information in enterprise systems were only two clicks away.

Embracing OSS gives organizations the ability to ease into new products with no upfront software investment. This model is the greatest threat to the software providers that are hiking their maintenance fees – and one of business's greatest levers.

Embracing the open source community also means embracing the local service providers that have built their businesses around open source. We attract, train, manage and retain really good software talent. Your projects mean survival for us.

So, why not do something about it and vote with your dollars and your feet (albeit slowly, cautiously, mitigating risk…)?
I’ll get you started if you’re interested.

Monday, July 21, 2008

The revised definition of Enterprise 2.0

Dr. McAfee's revised definition of Enterprise 2.0 provides interesting insight into his view of Enterprise 2.0. As he is the author of the term "Enterprise 2.0", everyone else can only choose to agree or disagree with his authentic revised definition.

Andrew gives examples of what is and what is not Enterprise 2.0. Here's what's NOT Enterprise 2.0:

· Wikipedia, YouTube, Flickr, MySpace, etc. These are for individuals on the Web, not companies. Some companies use sites like YouTube for viral and stealth marketing, but let's explicitly put these activities outside our definition of Enterprise 2.0.

· Most corporate Intranets today. As discussed earlier, they're not emergent.

· Groupware and information portals. Again, these tools don't facilitate emergence, although this may be starting to change. Groupware and portals also seem to be less freeform than the Web 2.0 technologies now starting to penetrate the firewall.

· Email and 'classic' instant messaging, because transmissions aren't globally visible or persistent. Some messaging technologies do ensure that contributions are persistent.
For me, the most interesting are the 3rd and 4th bullet points, because they exclude most legacy groupware applications, which many vendors are trying to throw into the Enterprise 2.0 bucket simply to take advantage of the buzz. This mislabeling of products does the vendors themselves, their customers and the technology industry a disservice and creates unnecessary and pointless market confusion.

Firstly, the vendors do themselves a disservice because rather than trying to reposition existing products they should continue to invest in R&D and innovation to create new products that are authentically Enterprise 2.0 compliant.

Secondly, their customers are done a disservice because they're being misguided by trusted vendors into buying products that won't deliver the potential benefits of authentic social computing and being sold propositions that won't stand the test of time.

Thirdly, the technology industry is done a disservice because we are on the cusp of a new era in computing – the social computing era – and we, the technology industry, should be doing everything we can to accelerate its adoption. Its evolution will unleash the untapped resource of social collaboration within organizations, which is the catalyst for the emergence of the information economy in which, as Steve Jurvetson predicts, the value of information traded will outstrip the value of physical goods traded. More importantly, this market is the realization of a more balanced lifestyle for knowledge workers and their families: the flattening and erosion of non-value-adding corporate hierarchies, the dismantling of non-value-adding tiers of middle management, and the creation of an economy based on the value created by knowledge transfer on a just-in-time basis.

Thursday, May 15, 2008

How Spaceo.us Enhances SOA by increasing the reusability of information

Service Oriented Architecture aligns business projects' technical architectures to a broader, more holistic enterprise IT architecture strategy. These strategies are architecturally underpinned by service-oriented patterns such as publish-and-subscribe and other event-driven patterns. The patterns deliver orchestrated business processes through BPEL and BPM with the goal of enabling business agility. SOA also promises functional reusability from the underlying application code, reducing the software development required by IT projects and, in turn, reducing the capital cost and time it takes to deliver the business initiatives – projects – which execute business strategy. BUSINESS AGILITY!
Ahhhh, business agility, the new holy grail of business. SOA projects deliver flexibility from a technical perspective. Existing application software is decomposed – broken down into discrete chunks of business functionality, the steps in a process – which can be identified, managed and maintained independently through registry-based governance. Very technical…. The challenge is how you communicate such technical and abstract concepts from IT to business and back again, time after time after time, as business continually morphs to deal with a dynamic and volatile global business environment. How do you know how reusable a business process (or service) will be before there is a real, business-driven requirement to implement the process a second time? Without this foreknowledge of business requirements (which is, to some degree, achievable through enterprise architecture) one runs the risk of over-investing (over-capitalizing) in IT infrastructure. SOA projects can increase the short-term cost of projects with the promise of delivering reusability, thus reducing the cost the second time one needs to use the business process (or service) – and again, we're on that slippery slope.
However, there are real-world examples of reusability being delivered on the Internet today. Google Maps has all but become the ubiquitous de facto standard for mapping functionality (as a business process or service), as YouTube has for video, Flickr for pictures, Wikipedia for reference, etc. These service providers – Google, Yahoo!, Amazon, eBay, etc. – are delivering services which are embedded into human-driven (or human-orchestrated) business processes through mash-ups.
Now comes the excitement of Web 2.0!
That's right. That's why business can get excited about Web 2.0. Because in Web 2.0, people – human beings – orchestrate business processes on the fly (at runtime) when the appropriate tools are made available through computers and other processor-based computing devices like PDAs, mobile phones, video (television), etc.
O.K., this is a new phenomenon. This is the first time in human history that we can do this. Computers, the Internet, global communications networks, technology convergence, etc. – yes, this is the first time this is happening. What many, including myself, have observed is that people access, use, re-use, step through and source information in different ways. The way we process information is unique to each and every one of us. This is what doctors are saying about the brain: each human brain is unique. It has to be, because each of us has a completely unique life experience. So, providing the services (maps, video, reference, pictures, order entry, shopping basket, provisioning, training, etc.) … you see, the services can also emanate from existing legacy systems when exposed as services through a SOA.
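To ground that last point, here's a minimal sketch of wrapping a piece of legacy functionality as a lightweight HTTP/JSON service that a mash-up could then consume. The legacy lookup and the URL path are invented for illustration:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def legacy_order_status(order_number):
    """Stand-in for a call into an existing back-end (legacy) system."""
    return {"order": order_number, "status": "SHIPPED"}

class OrderService(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /orders/PO-1234 -> {"order": "PO-1234", "status": "SHIPPED"}
        if self.path.startswith("/orders/"):
            payload = legacy_order_status(self.path.split("/")[-1])
            body = json.dumps(payload).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), OrderService).serve_forever()
```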
As Dion Hinchcliffe has been telling us, Web 2.0 is where SOA and business converge.
Web 2.0 tools or features or services – like wikis, blogs, comments, ratings, reviews, etc. – let users generate content through collaboration. The social collaboration layer is where the world of Web 2.0 and the world of SOA converge.
As a result, Spaceo.us, the first and only complete implementation of the social collaboration layer (designed from inception as a lightweight, Web 2.0, RESTful, WOA implementation), is completely synergistic with SOA implementations. In fact, it will undoubtedly increase the ROI of current SOA implementations by exposing people to services that can be mashed up (into composite applications) and orchestrated into business processes on the fly through Spaceo.us.
Once services are embedded in Spaceo.us they can be made available to users when they need them, at runtime, just like Google Maps. Taking this approach lets users determine the scaling requirements of services based on actual use.
Exposing services to users through Spaceo.us lets users orchestrate business processes on the fly whilst adhering to SOA governance and application-level access control.

Monday, May 5, 2008

Spaceo.us the Social Collaboration (a.k.a. social networking) Platform for the Enterprise (2.0)

A strange thing happened on our way to developing a social collaboration platform. We kinda serendipitously tripped over the Social Collaboration Layer. I’ve been involved (well, more than involved) in the development of a Social Collaboration Platform called Spaceo.us. Last week, we released the Beta for UAT. We (Aegeon PL) have been developing it since October 2007 and it’s the third generation of ideas that we’ve had about social collaboration software. The funny thing is that Spaceo.us fulfils the requirements of a Social Collaboration Layer from an enterprise architectural perspective.
Here’s what I mean…

The red social collaboration layer is where wikis, blogs, tagging, comments, ratings, reviews and lots of other Web 2.0 social networking tools sit – any tools that are lightweight and allow the creation of metadata.
We demonstrated Spaceo.us today to Timothy Hart, Director – Information, Multimedia and Technology for Museum Victoria in Melbourne, Australia. He said social collaboration "sets information free". I love that! That is exactly what recognizing the social collaboration layer does. It will set enterprise architects free to fit Web 2.0 into the enterprise (Enterprise 2.0).
Now things will get really exciting!

Tuesday, August 21, 2007

The Trouble with Enterprise Software

Cynthia Rettig has hit the nail on the head in her article "The Trouble with Enterprise Software" (http://sloanreview.mit.edu/smr/issue/2007/fall/01/)! I say this based on my personal experience with multiple SAP implementations. Whenever an organisation commits to the level of investment required to implement an ERP system, corporate politics come into play. Executive stakeholders in the project will spin the ERP implementation into a success even when the monolithic system stifles the organisation's ability to change. And that's the crux of the problem. It's not that ERP systems don't create efficiencies; it's that the efficiencies are based on a point in time. ERP systems often cement an organisation into a particular way of doing business. As the business environment changes, the cost of changing business processes within the ERP system outweighs the potential benefits of market opportunities on a case-by-case basis. This is particularly true for speculative new products and new market opportunities where revenue growth is uncertain. The high cost of changing these systems often makes the business case for new opportunities unviable. As a result, line-of-business managers may be forced to use smaller systems to launch new opportunities, adding to the complexity of enterprise IT environments.

Enterprise SOA will not be realised because the solutions developed by most vendors are too heavyweight and cumbersome. The best examples of SOA in action are web mashups. Organisations striving to achieve agility by implementing heavyweight middle-tier stacks are barking up the wrong tree. Why is it that no one pictures an ocean liner, a 747, or anything else that is large and complex when we envisage "agility"? We always picture lean, lightweight, hyper… Companies should look to small, lightweight, simple services to deliver agile business processes. Leave the legacy processes to the ERP systems, but build future nimbleness on lightweight, web-centric architectures.
