This blog post is a wrap-up of the Gilbane SF 2010 debate on the “Future of Open Source CMS” (#fosc on Twitter), featuring Geoff Bock & Dale Waldt (Gilbane Group), Ian White (The Business Insider), and input from Jahia, offering an open source content management vendor’s perspective.
Today, it is hard to define what an “Open Source CMS vendor” is, since virtually every CMS vendor uses open source in its products, contributes to open source, or provides services around open source. Additionally, most, if not all, software deals with “Content” in one way or another.
To get a clearer picture of the future of Open Source CMS, we need to approach the topic from two different angles:
- The Future of CMS
- The Future of Open Source
CMS is a strange beast whose definition is broad and uncertain. CMS mostly implies two different audiences: techies (CIOs, CTOs, and developers) and practitioners (marketers, information workers, and lines of business). From this perspective, it is evident we are not dealing with a single “system,” but rather with two.
1) The Future of CMS: Content Management Services or Content Management Solutions?
The former are looking for a “content platform”; the latter, for finished products and solutions that solve specific content issues.
As Wikipedia notes:
Application software is contrasted with system software and middleware, which manage and integrate a computer's capabilities, but typically do not directly apply them in the performance of tasks that benefit the user. A simple, if imperfect analogy in the world of hardware would be the relationship of an electric light bulb (an application) to an electric power generation plant (a system). The power plant merely generates electricity, not itself of any real use until harnessed to an application like the electric light that performs a service that benefits the user.
For a long time, CMS was a simple mixture of horizontal infrastructural libraries combined with vertical applications, without any clear segregation of duties. Most CMS solutions available today are still based on this monolithic approach.
Recently, a massive industry-led standardization and interoperability effort (think JCR or CMIS) has been coupled with a push to quickly prototype and launch rich content-enabled applications. This combination has led to a greater separation between content platforms and content-enabled applications.
Towards Composite Content Platforms and Content-Enabled Applications
Even though the term “composite” has existed for quite some time, it gained traction only recently as the cornerstone of SharePoint 2010; Gartner has actively pushed it as a replacement for the older and more limited CEVA term.
Content particles are becoming increasingly granular and structured. Moreover, there is an ever-increasing need to rapidly assemble, cross-link, enrich, and combine heterogeneous content objects. Therefore, the term “composite” sounds convenient and appropriate.
Composite Content Platforms are tomorrow's ECM 2.0
The nice thing about composite content platforms (call them content application servers or content management platforms if you prefer) is that they act as dynamic content containers, or content runtimes, which can run composite content applications. The next generation of composite content applications will be even more dynamic: they will not only glue cold content together, but will also natively inherit capabilities from the merger of application servers and content stores, creating hot, actionable content-driven applications.
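To make the "platform as a content runtime" idea concrete, here is a minimal sketch in Python. All names (`ContentRepository`, `news_feed`, the node types) are hypothetical illustrations, not any vendor's actual API: a single content store acts as a runtime on which small composite applications assemble heterogeneous content objects.

```python
# Hypothetical sketch: a content platform acting as a runtime for composite
# content applications. All names are illustrative; no real CMS API is implied.

class ContentRepository:
    """A minimal content store: typed nodes addressed by path."""
    def __init__(self):
        self._nodes = {}

    def put(self, path, node_type, **properties):
        self._nodes[path] = {"type": node_type, **properties}

    def query(self, node_type):
        """Return all nodes of a given content type."""
        return {p: n for p, n in self._nodes.items() if n["type"] == node_type}

def news_feed(repo):
    """A tiny composite application: assembles a feed from heterogeneous content."""
    items = {**repo.query("article"), **repo.query("press-release")}
    return [n["title"] for n in items.values()]

repo = ContentRepository()
repo.put("/news/a1", "article", title="JCR explained")
repo.put("/pr/p1", "press-release", title="Product launch")
repo.put("/docs/d1", "document", title="Internal spec")

print(news_feed(repo))  # → ['JCR explained', 'Product launch']
```

The point of the sketch is the division of labor: the repository knows nothing about news feeds, and the feed application owns no storage; either side could be replaced independently.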
Of course, a simple website could be considered a composite content application (in which case your httpd server could be seen as a kind of first-generation, lightweight composite content server). However, composite content applications can also scale up to more complex content-enabled applications with advanced business process schemas, strong business integration, and heavy personalization requirements.
All these composite content applications can produce and publish massive amounts of content and data, which need to be correctly managed. Here come the usual content management product families (WCM, DMS, DAM, RM…) that will help manage this deluge of information across all the content-enabled applications.
This platform/product split is already quite common in the high-end enterprise segment of the CMS market, and it is rapidly moving down-market, affecting all the other sub-segments.
Some recent examples:
- Day CRX vs Day CQ WCM or DAM
- SharePoint Foundation 2010 vs SharePoint Server 2010 Editions
- Alfresco Content Application Server vs Alfresco WCM, DMS, RM
- Nuxeo EP vs Nuxeo DAM, DMS, Case Management, …
- Exo/JBoss GateIn vs Exo Extended Services (DMS, WCM,…)
Due to their historical focus and inherited technologies, some of these frameworks or content foundations are still driven by a portal-centric approach (e.g. Exo), a document-centric approach (e.g. Nuxeo, Alfresco), or a web-centric approach (e.g. Jahia, Day). However, we can expect a rapid consolidation towards a universal set of core value-added services able to nurture and enrich any content asset, be it a web page, a document, a record, an email, or a scanned fax.
Content lifecycle services such as versioning, file plans, workspaces, content types, search and query services, interoperability services, mashability services, social services, persistence-independent storage services, etc. are becoming commodities.
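As an illustration of how commoditized a basic lifecycle service has become, here is a toy versioning service sketched in Python. All names are hypothetical; the point is that a naive version of the capability every platform is now expected to ship out of the box fits in a handful of lines.

```python
# Hypothetical sketch of a commodity content lifecycle service: versioning.
class VersionedStore:
    def __init__(self):
        self._history = {}  # path -> list of successive content versions

    def save(self, path, content):
        """Each save appends a new immutable version."""
        self._history.setdefault(path, []).append(content)

    def latest(self, path):
        return self._history[path][-1]

    def version(self, path, n):
        """Return version n (1-based), mirroring typical 'restore version' UIs."""
        return self._history[path][n - 1]

store = VersionedStore()
store.save("/site/home", "v1: welcome")
store.save("/site/home", "v2: welcome, now with news")
print(store.latest("/site/home"))      # → v2: welcome, now with news
print(store.version("/site/home", 1))  # → v1: welcome
```

A production platform adds branching, merging, and retention policies on top, but the commodity baseline really is this small.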
As open source commoditization actively ramps up and rapidly extends its borders, competitors must decide whether to horizontally extend the level of content services offered as part of their content middleware (e.g. archive more volume, support more load, add new value-added content enrichment services), or to move up the value chain and provide new lines of content-enabled products and applications that solve the needs of various business lines. Usually, both expansion strategies are pursued in parallel.
Four main trends are emerging:
- A rapid growth of built-in content enrichment services available for any type of content asset
- Improved content interoperability services at the data level (OpenData, DataPortability, CMIS, RDF…)
- The need to quickly assemble, mash up, and reuse existing content assets within various content-enabled applications
- A tsunami of information, which needs to be correctly assessed and managed.
With the rise of composite content platforms and content-enabled applications, we should see a shift from monolithic CMSs towards a better-partitioned architecture:
- Composite content platforms which will serve both as a development foundation and as a production runtime for content-enabled applications
- Content applications, which could be rapidly developed and run on top of a composite content platform
- Content management products, which will let users best manage their content assets across all their content applications.
Ideally, the next generation of composite content platforms should ensure a level of data openness and interoperability, aligned with the current CMIS, OpenData and other DataPortability trends.
The goal of this new generation of Composite Content Platforms will be to offer a wide range of content enrichment services, while ensuring proper data interoperability and freedom. Ideally, composite content applications will become more standardized and portable, much as web applications became more standardized during the last decade. However, such a standardization process would take at least 5-10 years. Data Portability will therefore become one of the key purchase criteria.
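The data portability argument can be illustrated with a short Python sketch: exporting a content node to a neutral serialization (plain JSON here; in practice CMIS, AtomPub, or RDF would play this role) so that a content application can move between platforms. The function names and payload structure are hypothetical.

```python
import json

# Hypothetical sketch: exporting content in a vendor-neutral format so that a
# composite content application is not locked to one platform's storage.
def export_node(path, node_type, properties):
    """Serialize a content node to plain JSON, a lowest-common-denominator
    interchange format (CMIS or RDF would be richer real-world choices)."""
    return json.dumps(
        {"path": path, "type": node_type, "properties": properties},
        sort_keys=True,
    )

def import_node(payload):
    """A second platform can rebuild the node from the neutral payload."""
    return json.loads(payload)

exported = export_node("/news/a1", "article", {"title": "JCR explained"})
roundtrip = import_node(exported)
print(roundtrip["properties"]["title"])  # → JCR explained
```

The lossless round trip is exactly what the purchase criterion above demands: content checked into one platform must come back out, intact, for use in another.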
2) The Future of Open Source: Properly defining the limits of Open Core
The line between proprietary and open source software has become increasingly blurred, as open source software is embedded in proprietary products and extensions. There is also plenty of confusion about the term “community”: community builds, originally based on the unstable development branch, are now promoted as “Freemium” editions for viral marketing purposes. Besides, the scope of “core” features tends to be slowly but surely pared back to boost sales of newly created commercial extensions.
So what can we expect of “Open Core” software vendors? How can we better define ethical and fair boundaries both for open source communities and vendors, while ensuring a reasonable level of open source “purity”?
Simply put, more than 70% of open source contributors are now paid professionals, while all commercial open source vendors look for ways to monetize their initial investments. This makes perfect sense, as any commercial entity needs to generate revenue to pay its employees and reward its shareholders.
The Open Core business model is only the latest in a long series of commercial open source business models. Over the past two years, it has rapidly gained momentum, but it is also facing heavy criticism (cf. Gartner or InfoWorld). The model is increasingly considered just another kind of “Shareware 2.0” or, at best, a lightweight, limited, and free SMB edition of the vendor’s main product offering.
Essentially, there is nothing wrong with the Open Core approach; it has existed for years, even if it was not marketed under this term. It all comes down to how the vendor or the community defines the notion, including the scope and the “raison d’être” of the core vs the vendor’s product derivatives.
In today's landscape, we can discern several common pitfalls:
- Unclear core boundaries: there is no clear delineation between features that should remain in the open source core vs those reserved for commercial product derivatives.
- Community development editions vs Freemium marketing editions: builds for developers and early adopters are often mixed up with “gratis” editions used to promote and evangelize a product line.
- Community open source vs vendor-controlled commercial open source: often the underlying intentions of the original contributor regarding its core are unclear. Is the aim to keep control over the project, or to let the community drive development?
In practice, an Open Core strategy often leads to the following consequences:
- Endless debates: Defining the scope of the core vs that of the proprietary product derivatives is a frequent source of contention, both among the vendor's employees and between the vendor and the community.
- Cannibalization of offers: Often there is little marketing distinction between the Core and Commercial Editions, in which case cannibalization normally occurs, to the detriment of the Core.
- Community exclusion: Vendors tend to favor their proprietary derivatives over community contributions, and shift their focus to value-added enhancements rather than enhancements to the Core.
- Customer FUD (fear, uncertainty, and doubt): Customers often want to buy software based on a free, community-driven open source project, but in the end find themselves in a classic proprietary vendor lock-in scenario. They are uncertain about the future of the Core: its maintenance, its migration path, its upgrades, and ultimately a product roadmap driven by the strategy of the commercial derivatives.
Open Core Main Success Criteria
We can try to solve some of the classical Open Core issues with a series of best practices:
- The Open Core should be of real utility to its target market audience, and shouldn’t tie further usage or initiatives to the vendor’s other product lines. Open Core is not about confining customers in a closed shell; it's about promoting an open kernel that will help seed other initiatives and attract a long-term community.
- It should be able to evolve, even if the main contributor initially wishes to follow its own independent product roadmap.
- It should not intentionally degrade stability, scalability or enterprise-grade features to boost conversion rates to commercial offerings.
- Last but not least, there should be no overt conflict of interest with the vendor's other product lines, lest the value proposition fall back into the Freemium/Shareware 2.0 camp despite the availability of the source code.
The value proposition and scope of the Open Core product offering should be clear to all stakeholders. Most importantly, users should be able to foresee a future for the Open Core product alongside the extensions, derivatives, and additional product lines promoted by the original vendor.
This leads us to the following suggestions:
Suggestion #1: Better distinguish your product branches
First, let’s not confuse Freemium offerings aimed at practitioners with development builds for code contributors or early adopters. The term “community” normally refers to the crowdsourced, collaborative aspects of an open source project (free as in speech), not to the fact that it is gratis (free as in beer). Releasing community builds does not prevent a vendor from simultaneously offering Freemium editions of its various product families.
So why are so many Open Core software vendors trying to redistribute an unstable version of their product as a promotional resource, in the hopes of converting users to stable and enhanced commercial editions?
Second, most vendors need to improve the transparency of their Open Core strategies. The vendor should clearly state which of its products aim to become a community-driven open source Core, and which it will control more strictly through dual GPL/Commercial licenses, or even more proprietary licensing schemes. There is no shame in being a Commercial Open Source vendor in 2010, so vendors should be candid about their position.
Suggestion #2: Keep your Core away from possible conflicts of interest
The second suggestion is to clearly delineate the scope of the various product lines, to avoid any long-term product cannibalization. This helps clarify not only the audience and the feature scope, but also the entire roadmap of each sub-product. The only sure way to do this is to avoid all direct conflicts of interest between your core kernel and your product derivatives. Ideally, the Core should have a long-term perspective that encompasses the vendor's commercial derivatives. Other organizations, and even competitors, should be able to reuse, leverage, and extend your core. Co-opetition should be made possible.
Common bad practices are the following:
- Withholding enterprise-grade features or imposing scalability limitations. This classic marketing tactic, associated with vendor lock-in, is not viable in the long run: it creates conflicts with the community and can ultimately force a fork of the core, leaving two distinct branches to maintain.
- Features required by the community (or even contributed by the community) that are deliberately placed in proprietary extensions (or not committed back into the Core by the vendor).
- Refusal to loosen control of the Core: The vendor keeps full control of the Core and does not grant committer access to any third party.
A frequent underlying cause of these bad practices is the lack of clear product boundaries vis-a-vis the Core, leading to severe conflicts of interest.
Suggestion #3: One size does not fit all
Now that Open Source business models are better understood by developers and customers, more vendors are using various open source strategies as part of their product families.
For example, one could release a community-based Open Core under a business-friendly license, associate it with hybrid GPL/Commercial derivatives, and complement it with other proprietary extensions.
The “purity” of an open source vendor no longer has much meaning in 2010. Rather, we should speak of the purity of a given open source project, be it a Core, a library, or an entire product line. As vendors continue to adopt hybrid strategies, the level of “purity” should be assessed for each particular product offering. 100% pure open source vendors should start exploring various licensing models and apply them distinctly to each of their sub-products. As a result, these vendors will develop a more global and valuable open source business model. Customers will have to understand a vendor’s entire open source strategy before rushing to deploy the core or any other sub-product.
This trend underscores the need for vendors to avoid spreading open source FUD among their customers. A company's open source business model should be simple to explain, and should clearly state the value proposition for all stakeholders.
We can summarize this chapter by listing the following key points:
- Open Core vendors tend to create confusion by mixing their Community with their Freemium editions, and their stable with their development branches.
- Most Open Core vendors do not clearly delineate the kernel from their product derivatives, which usually creates severe conflicts of interest.
- Open Source vendor purity has largely lost meaning in 2010. Purity should be assessed on a project-by-project basis, and the vendor's entire open source business model should be clearly communicated to customers.
3) Applying the Open Platform paradigm to the CMS industry
Let’s now try to combine the first chapter, Future of CMS, with the second one, Future of Open Source, to envision how Open Source CMS could evolve over the next few years:
- Composite Content Servers, Content-Enabled Applications and Content Management Products will be better differentiated, and split into distinct product lines.
- An Open Core CMS strategy makes sense, especially if the Core becomes the Composite Content Platform. Such an Open Core strategy will avoid long-term conflicts of interest with the Content-Enabled Applications or the Content Management Products delivered on top.
- The community of open source developers always tends to favor infrastructure and middleware initiatives over finished, ready-to-use software products. We should therefore witness the rapid rise of new hybrid business models: open-sourced Composite Content Servers for techies, released under a business-friendly license, combined with dual-licensed or even proprietary content applications and content management products for practitioners. It is wise to keep the communities of techies and practitioners separate.
Applying our conclusions to the Open Source CMS industry, we can now try to draw a general picture of future business models.
Of course, each CM product and vendor is different, so there will certainly be hundreds of heterogeneous variations of this business model over the next few years. But the underlying paradigms should be pretty similar.
I will further detail each of these major paradigms in future blog posts. Meanwhile, please do not hesitate to add your thoughts and comments below.