Nov 15, 2006
If you have been to a content management conference in the last year, you might have seen one of Tony Byrne's 'CMS Idol' or 'Web Idol' competitions. If you have, you know they are a nice excursion from the typical vendor expo. If you haven't, the basic premise is similar to the American Idol television program, but instead of singing and dancing (thank goodness) the vendors give short, six-minute demos of their products. Like the television show, there are three judges (I once played Randy, dawg) giving commentary, but the audience decides. Tony used to do a regular conference session called something like "60 seconds with a vendor" where he would ask questions from the audience. That was good, but this is even better because it is visual and fun.
Web Idol is also entertaining if you like rooting for the underdog, because the big heavyweights of the industry rarely do well. They try to do too much with their demos to show their breadth and, on any one feature, they are not necessarily better than a simpler product with a well-defined niche and a clear understanding of the problem it is trying to solve. Another factor is that the small players really gear up for these things while the big guys often don't even participate, because the downside of losing is far worse than the upside of winning (people expect a product that costs 10 times as much to be better). The arrogance of some "industry leaders" doesn't help either.
Winning Web Idol does not mean that you have the best product (there is no best product), but it does indicate that you have connected with the audience and shown a solution that they can visualize using within their organization.
Even with that disclaimer, I still think it is very cool that eZ publish beat out FatWire, SiteCore, Terminalfour, and Tridion to win Web Idol at cmf2006 in Denmark last week. This is not a 1980 U.S. Olympic hockey team class of upset, although there are probably some people out there who think it is. People who have not seen open source content management software frequently believe these products are developed by programmers, for programmers, with very poor usability. Usability and simplicity are the key factors in Web Idol, and the cmf2006 audience was clearly impressed. So, kudos to the eZ team for a great set of demos and for removing some of the fear, uncertainty and doubt about open source technologies.
Nov 08, 2006
Rick Shreves just announced the availability of a new book on Mambo: Mambo Visual Blueprint. This Wiley Visual Blueprint series book is the third on Mambo (following two Packt books: Building Websites with Mambo and Mastering Mambo: E-Commerce, Templates, Module Development, SEO, Security, and Performance). I have not read it yet, but if you like the "Visual Blueprint" style and you want to learn about Mambo, you might want to check this book out.
For those of you keeping score, that's 3 to 2 (if you count Robert Deutz's German-only Joomla! book), Mambo over Joomla!
Nov 08, 2006
My colleague, John Eckman, has an excellent post about a recent digital fracas on what it means to be a Web 2.0 company. John summarizes a blogument between Lawrence Lessig and Nick Carr. While the discussion falls short of accusations of fascism, there is a considerable amount of comparison to Maoist philosophy. Fun reading for those of us who appreciate a good intellectual dust-up.
Nov 02, 2006
Frequently the terms "Internationalization" and "Localization" (abbreviated to "I18N" and "L10N") appear in requirements or Request for Proposal documents. While companies are typically under-prepared to fully support a localized website, it is good that they are thinking ahead to when they are ready to reach out to these different markets. Unfortunately, too often I hear localization talked about in binary terms. As in, "does this product support localization?" Or "should we localize this site?" In reality it is not black and white - just many gradations of gray.
Faced with the similar problem of determining whether a web site is "accessible," the World Wide Web Consortium (W3C) Web Content Accessibility Guidelines (WCAG) came up with a three-tiered structure of priorities that range from "must have" to "may have." This allows people to qualify just how accessible a site claims to be. There are many similarities between accessibility and localization. After all, when we talk about localization, aren't we really talking about accessibility for people with different languages and customs? In both cases:
-
We are trying to reach out to an audience that is presently not able to access the content.
-
There is a cost-benefit trade-off as to how far we go to serve these audiences. Hopefully, this will not create a big ethical debate but it all depends on your audience and what their capabilities and sensitivities are. Note: if your website works well with a screen reader, it will also work well with a search engine spider. So you don't have to care about social responsibility to care about accessibility.
-
You might be mandated to serve a certain audience. In the U.S., there is Rehabilitation Act Section 508. In Canada, government publications must be in English and French.
Surprisingly, there are no guidelines or evaluation criteria for localization. Not until now...
Later this month at the CM Professionals Summit, I am going to hold a round table to get feedback on these Levels of Localization. If the session is productive and we reach alignment, the working group will propose this to be a CM Professionals endorsed set of guidelines on localization. To get more feedback, I am going to post some initial ideas here.
Before I go too much further, it will be useful to define what I mean by localization. Localization means supporting specific alternative locales (geographic regions distinguished by language, government, and custom). Localization can be (and frequently is) part of an Internationalization strategy of reaching broader audiences and interacting in a global marketplace ("Globalization" is usually used to describe an economic process of regional economies merging into a global economy). Factors that go into the localization process include:
-
Text (including text on images) translated into local language and dialect
-
Prices and other money references converted into local currency
-
Formats (such as date and time) displayed according to local conventions
-
Weights and measures converted into locally accepted units
-
Culturally appropriate imagery and colors
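To make a couple of those factors concrete, here is a minimal sketch in plain PHP of the kind of per-locale formatting a presentation layer ends up doing for prices and dates. The locale codes, exchange rates, and format strings are illustrative only:
<?php
// Illustrative per-locale display rules; the values are examples, not a real feed.
$locales = array(
    'en-US' => array( 'symbol' => '$', 'rate' => 1.00, 'dec' => '.', 'thou' => ',', 'date' => 'm/d/Y' ),
    'de-DE' => array( 'symbol' => '€', 'rate' => 0.78, 'dec' => ',', 'thou' => '.', 'date' => 'd.m.Y' ),
);

// Convert a base (USD) price and format it with the locale's conventions.
function localizedPrice( $usdAmount, $locale, $locales )
{
    $l = $locales[$locale];
    return $l['symbol'] . number_format( $usdAmount * $l['rate'], 2, $l['dec'], $l['thou'] );
}

// Format a timestamp with the locale's date convention.
function localizedDate( $timestamp, $locale, $locales )
{
    return date( $locales[$locale]['date'], $timestamp );
}

echo localizedPrice( 1299.50, 'de-DE', $locales ); // €1.013,61
echo localizedDate( time(), 'de-DE', $locales );   // e.g. 15.11.2006
?>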
Not all websites and content are worth going to all that trouble for. Here are some intermediate steps that might serve the goal of increasing accessibility to new user groups but fall short of the true definition of localization.
-
Level 1. This level assumes that the audience is proficient enough in the site's primary language to be able to navigate a site and find what they need. However, the specific assets that they seek are critical enough or the detail is important enough that they would feel more comfortable accessing them in their local language. Sites that reach this level have the following characteristics.
-
Selective translation. While all of the main site components (navigation, header, footer, etc.) and the bulk of the content are uni-lingual, certain important assets are translated into alternative languages. Typical examples of selectively translated content include product manuals and downloadable forms. A news agency might have a feed of news that is written in or translated into a localized language.
-
Transactive accessibility. Pages that require user input should be usable by visitors from the target locale. For example: different phone number and postal code formats, neutral labels for address fields (province vs. state), and possibly double-byte characters.
-
Level 2. This level is achieved when the user is able to select a language and the whole site (including navigation and buttons) is presented in the selected language. Everything that is available in the primary language should be available in the localized languages. However, in cases where content is not translated, there should be a fall-back mechanism that notifies the user that the asset is not available in his selected language and provides access to the primary language version (a minimal sketch of this fall-back behavior appears after this list). There are many subtle nuances with this behavior that should be fully specified and understood. For example, when the site falls back to the primary language of an asset, does the whole site switch over as if the user had selected another language? Or does the body of the primary language content appear within the selected locale's navigation? When a user selects another language, does the site refresh to the home page or to the localized version of the content currently being viewed? How divergent are different localizations of a site allowed to become? When a new version of one translation is published (or reverted), what happens to the other translations of that same content asset? Are they un-published until they are updated to reflect the new version of the primary translation? Is anyone warned? Or are they left up with the potential of being out of sync? Level 2 localization is usually supported on a single site instance by the CMS's internationalization functionality, which maintains relationships between different translations of content, remembers the user's locale selection, and provides a translation framework for static text within presentation templates (for example, the word "search" on the search box). A CMS's localization framework will usually have specific philosophies on these nuances, so it is important to understand how localization is implemented.
-
Level 3. The highest level of localization has all of the characteristics of the localization definition described earlier in this article. This level requires a balance of power and control between local management (which knows the local audience) and corporate headquarters (which understands the global strategy and vision of the company). Sometimes it is difficult to tell the difference between full localization and a chaotic collection of renegade foreign offices. If the appropriate balance is achieved, local audiences receive the information that the international company wants to communicate, but delivered in the way that the local branch wants to express it. Out of balance, different audiences will receive intentionally or unintentionally conflicting information. Achieving this level of localization is largely a governance problem. Local variances need to be explained to and approved by headquarters in a business process that does not unduly obstruct the local divisions' need to conduct business. There needs to be a communication mechanism so that global and specific changes desired by corporate are communicated to and executed by the localized sites. These same forces complicate many other aspects of managing multi-national companies, and sometimes it is hard to shield customers from these dynamics. I am still waiting to see a perfect example of a single instance of a CMS supporting this complex network of control. In general, local business units require a degree of independence and agility that is difficult to achieve on a single, centrally managed platform. However, there are things to look for in a CMS that will help matters:
-
A solid notification system. When something changes, the appropriate people need to be notified of the nature and impact of the change. This could be workflow-based email notifications to external users, or cross-system workflow where the primary system initiates a task on a secondary system.
-
A localization aware reporting system. In order to achieve adequate governance, it is necessary to understand how content is being translated and syndicated to localized sites. There should be some way for the system to know if a newly published content asset has been localized. This could be achieved by a post-publish workflow state. Web log analytics are useful here too. It is important to know of broken links coming from localized sites.
-
A replicatable platform. While it is unrealistic (and frequently undesirable) to require every local business unit to use the same CMS, synergies may be achieved if a corporate standard were made available for localized sites. Presentation templates can be re-used and customized for a local market, users required to work on multiple systems will have less technology to learn, and it may be easier to orchestrate cross-system workflow (on the last point, lock-in risk might be mitigated by integrating with a third party workflow engine). More importantly, the software acquisition costs should make it affordable to distribute the tools. This also holds when a local site manager needs to log into the primary CMS to access pre-published content. These occasional users need accounts, and it should not be cost prohibitive to provide them.
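Going back to Level 2, the fall-back rule is easier to see as a small sketch. This is plain PHP with function and field names of my own invention, not any particular CMS's API:
<?php
// Serve the visitor's language if a translation exists; otherwise fall back
// to the primary language and flag the page so the template can say so.
function resolveTranslation( array $availableLanguages, $requested, $primary )
{
    if ( in_array( $requested, $availableLanguages ) )
    {
        return array( 'language' => $requested, 'fellBack' => false );
    }
    return array( 'language' => $primary, 'fellBack' => true );
}

$page = resolveTranslation( array( 'eng-GB', 'fre-FR' ), 'ger-DE', 'eng-GB' );
if ( $page['fellBack'] )
{
    // Keep the German navigation and chrome, render the English body, and
    // tell the visitor that this asset is not yet available in German.
}
?>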
Hopefully, these Levels of Localization will introduce some much needed terminology to the discussion of an important topic that companies are increasingly considering in their CMS implementations. I am sure that a continued dialog will refine these ideas and drive toward a new standard that helps companies understand their real localization needs and be able to communicate them in the specification of a CMS. If you find this concept interesting, please send feedback. Or, better yet, join me at the CM Professionals Summit!
Nov 02, 2006
I feel like one of my basketball pool teams just got eliminated. Oracle just announced that it intends to buy Stellent. I had them getting scooped up by HP. In addition to the disappointment of a failed prediction, I feel a sense of doom for Stellent's evolution in the field of usability. If they are lucky, Oracle will leave the product alone. Otherwise, content authors should get used to command-line tools in much the same way all real Oracle database administrators still love their SQL*Plus.
Oct 30, 2006
Lisa Welchman has a great post on CMS Watch Trend Watch that describes the phenomenon of Web Content Management System buyers seeing a CMS as just a wrapper around the WYSIWYG editor. I can't even begin to say how true that is. Recently, a client was having his first look at the WCM system that we were implementing (after not participating in any of the prototyping or reviewing the incremental builds that we had been doing over several weeks) and he left me a voicemail saying "the administrative interface is all wrong. We have major problems." As you might expect, I was very concerned. Anyway, it turned out that all of his issues were around the WYSIWYG editor. To him, the WYSIWYG editor was the CMS. It turned out that it was misconfigured and everything is OK now. Phew! That would have been bad.
Way back when, I remember the debate over WYSIWYG editors between the CM purists and the user facing pragmatists. The purists didn't want any markup (formatting) in content and the pragmatists were trying to appease users who wanted to make web pages with as much control as writing a document in Microsoft Word. The compromise was to give users free control over small portions of the page. However, given the attention that the editors are getting, it appears that the balance is shifting.
I think this is natural as WCM goes "down market" to run small web sites that had at one time been just static HTML. Small websites have fewer authors and less content so they do not need as much centralized control or content reuse. Strict adherence to content management best practices is less critical. All that is needed is a reduced dependence on the HTML literate webmaster. The CMS becomes more of an HTML editing and deployment tool. Ironically, the HTML savvy webmaster that managed the static site frequently becomes the sole user of the CMS. I would argue that this is not true content management. But maybe not all CMS buyers need to manage content. They just need to manage their website. Still, if you are spending hundreds of thousands on a CMS, presumably you have real content management needs and you should be looking at more than just the WYSIWYG editor.
Oct 29, 2006
Recently I have been doing a lot of talking (and listening, and reading) about the challenges and strategies of selecting a CMS. On October 25th, Bryant Shea and I hosted Tony Byrne and Erik Hartman in a discussion about CMS selection on the latest installment of the Malcontents. Then, later in the day, we had a Massachusetts CM Pros Chapter Meeting on the same topic. That, along with countless one-on-one conversations I have had recently, makes two things very obvious: many people are out there trying to acquire a CMS, and many are struggling with this task. I have written on this topic in the past, but it has been a while so I thought I would put some ideas down.
There are a number of reasons why selecting a CMS appears to be particularly challenging:
-
The CMS market is large (as in thousands of products) and chaotic, with no clear separation between winners and losers. In fact, during the podcast, Tony made the point that there are many CMS vendors doing very well, rather than a clear divergence between leaders and laggards.
-
Especially in Web Content Management, the software is really just a framework. If everything is configurable or customizable, how do you evaluate what you are looking at? A couple of years ago I bought a bike where the frame was custom built for me. As I was test riding different models, I was thinking "what is the point? Everything I like or don't like about this bike (every tube length, diameter, thickness, and angle) will be potentially different in what I will buy." Furthermore, in the case of a CMS, it is not always obvious where the "load bearing walls" are. That is, aspects that cannot be changed without compromising the product or major re-engineering.
-
Every potential customer's content management needs are slightly different. Content management is a ridiculously large topic to begin with. Even within a sub-discipline like Web Content Management, different companies will have different problems and different perceptions on how technology can address them.
-
Likewise, CMS vendors also have their own perspectives on the "content management problem" and how to solve it.
-
Most CMS products are reaching out to solve problems outside of their sweet spot. Either they are building capabilities to solve problems that they do not fully understand, or they look at the world through such a strong lens that everything looks like the problem they know ("if all you have is a hammer, everything looks like a nail").
-
Many companies looking for a CMS now are doing so because of one or more failed CM initiatives and, rightly or wrongly, have blamed technology. This raises the stakes in the selection considerably.
-
Solving a content management problem requires a very close partnership between the technology organization and the user community. Most companies don't have the right chemistry between these two groups. Typically, regardless of what is said, I.T. owns the software and the business owns the processes. In content management, it is impossible to separate the two.
-
Usability is a critical factor in CM success. Bad usability causes users to either under-use or misuse the software. Most I.T. organizations practice software development methodologies that are not geared to usability. How many project plans have you seen that start with textual requirements gathering (which gives no indication of what the software will be like to use) and end with a short period of "usability testing" or "user acceptance testing?" The software development team leaves with a spreadsheet and returns with the final product. At that point, it is often too late and too expensive to make changes. Word to the wise: don't do usability testing if you are not prepared to respond to the feedback.
-
At any one time, most I.T. organizations are faced with multiple business units that are asking for tools to help solve content management problems. There is a natural tendency to try to buy one solution that answers all of these requests - no matter how diverse the needs are.
If you made it this far, you probably feel doomed. How can you be successful if you have this much working against you? Well, here is some advice that will make your life easier:
-
Cancel your Gartner subscription. There is no magic quadrant in CMS. What is important is your users and your content. The right product for another company, even in your industry, even another business unit in your own company, may not be the right product for you.
-
Don't focus on the feature lists. The software market has regressed into a conversation of checklists. Software vendors try to "check off" features that they think their customers will want or that their competitors have. Customers reading vendor brochures start to believe that they need these features. It's an ugly cycle that leads to bloated software and unhappy users.
-
Do focus on the "How." Rather than ask "does your product have this?" (you know the answer will be "yes"), ask "how could I use your product to do this task or achieve this goal?" This starts a dialog that really tests the mutual understanding of the need and the solution.
-
Use Scenarios. No matter how you define scenarios, they are a good thing (I can't believe I just wrote that). Seriously, I have heard different experts talk about using scenarios in different ways, and it all seems to be good advice. The traditional meaning of "scenario" in software development is a flow through a use case. That relates to my prior point on "Focus on the How": walk through usage scenarios of executing a specific task in the context of a particular product. When James Robertson says "scenario," this is what he is talking about. The latest version of the CMS Report (and a CMS Watch feature article) discusses 12 "universal usage scenarios" ranging from a corporate brochure site to multi-channel publishing. I have taken a similar approach in a white paper on evaluating open source WCM systems. This is what I would call a "macro scenario," which describes the case of an organization with a particular need.
-
Prototype, prototype, prototype. Don't make a leap of faith based on a canned demo and a bunch of promises. While generic demos might help you narrow your list of candidates to a short list, before putting your money down, build a prototype with your content and have your users bang on it. In some cases, especially when dealing with open source (where there is less potential for collecting licensing fees), you will have to pay for this prototype. However, the investment is worth it. If you feel embarrassed or unworthy asking a vendor for this, you are talking to the wrong company.
-
You are acquiring more than software. While the software itself is certainly important, it alone will not solve the problem. The software comes with an ecosystem that can either support you or steer you down the wrong path. You need to be comfortable with the systems integrator that will translate the software into your context, the customer support organization (or community) that will bail you out if you get out of your depth, the product management that sets the vision of the product, and numerous other stakeholders. In my (probably ineffective) bicycle example earlier, I was confident that I would get what I wanted because I agreed with the builder's philosophy of bike design and the sales person understood my needs. I am really happy with the results.
-
Understand the user community. Ideally, you may be able to build relationships with other like-minded customers who you can learn from. Knowing the customer base may give you an idea of how the product might evolve in the future. Understand who is using the software and what they are using it for. While this is easier to do in open source, you also may be able to do this with commercially licensed software.
All of the advice that I have given so far has been qualitative and requires deep and active evaluation. You just can't do that for 2,000 vendors. So how do you get to a short list? There are a number of resources that will be helpful. Join an organization like CM Professionals or AIIM to get in touch with other users that have done what you are trying to do. CMS Matrix is good for getting filtered lists of options, although the accuracy of the reviews is very subjective. The CMS Report is a great resource that presents the product landscape and very sound advice on executing your CM initiative. CMS Review has a lot of information, but some of it is dated. Bob Doyle, editor of CMS Review, also has a nice, simple article in EContent called "Select a CMS in 15 Steps." Like a lot of good advice, it is pretty much common sense. No matter what you do, get yourself out there and start reading and talking to as many people as you can. Don't rely on a software sales team to teach you everything you need to know about content management. Talk to some independent third parties. Use the product in your own business context. Get a feel for what the company is like to deal with after the contract is signed.
Oct 27, 2006
There has been a great thread on the CM Professionals mailing list about automated tools for migrating content. When the same topic was discussed a few months ago, the general consensus was that the hope of effortlessly migrating into a new CMS (either from an old CMS or a static site) is unrealistic. A few people made the point that that limitation is not necessarily a bad thing because there is value in evaluating every piece of content as it is moved to the new site. Slurping out of one website and spitting into another is likely to simply move the mess rather than achieve the goal of creating a more useful and manageable web resource.
The dialog was re-energized today when a new member introduced himself as working for a company called Vamosa. Automated content migration is what they do. The process that Vamosa practices is not a simple turn-key process. That was reassuring, because I know that if that is what he had claimed, he would be lying. A typical Vamosa project takes roughly 3-4 months. It consists of measuring the gap between the state of the current content and the target system, and creating rules for parsing and mapping content; then, once everything is set up, the system can migrate 10,000 pages per hour. I assume that it takes several tries, because that is my experience even for simple relational database migrations.
This seems like an interesting service to look into. Some legacy systems may be more suitable for this solution than others. For example, if the target is highly structured, and the source has no structure and there is no uniformity of layout on which to base parsing rules, prospects would be dim. However, if there is uniformity and structure, there would seem to be potential.
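As a rough illustration of what a parsing and mapping rule can look like when the legacy pages do share a uniform layout, here is a short PHP sketch. The file path, XPath expressions, and target content type are invented for the example:
<?php
// Lift the title and body out of a legacy page with a predictable layout and
// map them onto a structured record for the target system's import routine.
$doc = new DOMDocument();
@$doc->loadHTML( file_get_contents( 'legacy/products/widget.html' ) ); // example path
$xpath = new DOMXPath( $doc );

$titles = $xpath->query( '//div[@id="content"]/h1' );
$bodies = $xpath->query( '//div[@id="content"]/div[@class="body"]' );

$mapped = array(
    'content_class' => 'article', // hypothetical target content type
    'title' => $titles->length ? trim( $titles->item( 0 )->textContent ) : '',
    'body'  => $bodies->length ? $doc->saveXML( $bodies->item( 0 ) ) : '',
);
// $mapped would then be handed to the target CMS's import API, and any page
// that fails these rules gets flagged for manual review.
?>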
Other posters in the dialog had ideas for other tools that would help in content migration. For example, if Microsoft were to make a version of Word that was more of a content entry tool than a desktop publishing/layout tool, there might be hope that good, structured XML content could come out of it. I hear that Information Mapping's Content Mapper product does just that, although I have not used it yet. Another email talked about the need for tools to plan and manage the content migration process. That sounds interesting too.
Note: If you are a CM Professional, saw a point that you made on the mailing list in this blog post, and want attribution, please email me and I will quote and attribute you. I kept it anonymous for privacy reasons.
Oct 20, 2006
The CM Professionals Massachusetts Chapter will be holding a meeting at Molecular at 7PM on Wednesday, October 25th.
Here is the agenda:
-
Chapter updates including the CM Professionals Summit. [BTW, I just got an update on the Summit at this morning's board meeting. This is going to be the biggest Summit in the history of the organization.]
-
A round table discussion on selecting a CMS. [Bryant and I will be having a Malcontents podcast on the very same topic. Stay tuned. ]
-
A discussion of a new content management Lifecycle Poster. [For those of you just tuning in, there is a long history on this topic. Content management is a difficult concept to describe. Having a picture is tremendously helpful to express the many facets of managing content. Putting together the picture is hard because it forces consensus among many different perspectives.]
I am going to be there. Are you?
Sep 27, 2006
[Author's Note: This is a derivative work of a white paper written by me and published by Optaros called "Content Management Problems and Open Source Solutions." For more information on the original report, read here. Rather than try to republish the whole report, I am going to post updated reviews here. The general concept is the same: group systems by how they are most frequently used and describe them in terms of Content Structure and Editing, Management, Presentation and Layout, Community and Support, and Roadmap and Vision. I also rate things like the resources available; the ratings are still in terms of below average, average, or above average. I am also trying something with microformats in my brief assessments of the resources available. Next up will be Magnolia, which I have been wanting to review for a long time.]
Version Reviewed | eZ publish version 3.8.3
Most common use | Informational Brochure Site
Also used | News site, eCommerce
Architecture | LAMP
Organization | Development controlled and supported by eZ systems
License | Dual GPL/Commercial
Resources
Books:
This book is published by eZ systems and is the basis for their training and certification program. It is easy to read and very strong on the basic concepts. However, it is thin on advanced development topics.
Learning eZ publish 3: Building Content Management Solutions. Rating: ***
This book is somewhat outdated. It is targeted toward a more technical audience and addresses more advanced topics.
Training:
The certification program is new but off to a good start. The test was hard.
Online Documentation:
Much improved over earlier versions. Good API reference.
User Forums:
Very active and helpful.
Overview
eZ publish is a commercially supported, dual-license web content management system written in PHP by Norway-based eZ systems. The GPL and commercially licensed versions of the software are identical. Most end users can use the GPL version unless they intend to resell or OEM the software. eZ publish has many reference sites including ecommerce, educational, interactive media, and corporate web sites. Development of the core product is closely managed by eZ systems, with the community contributing extensions and patches that may later be incorporated into the core product. eZ publish does not have the grassroots development community that Joomla!, TYPO3, and Drupal have. Perhaps this is because it is a commercial open source project, not a community driven project. However, in addition to eZ systems, which licenses and supports the product, there is a large partner network that provides integration and training services. eZ systems is also having an impact on the general PHP community with its introduction of eZ components, which competes with the new Zend Framework and more established frameworks such as Symfony and CakePHP as a standard framework for building PHP applications. If eZ components is successful, the population of developers familiar with the technology concepts behind eZ publish, such as the templating framework, will increase.
Architecturally, eZ publish is well designed and extensible with a nice separation between business logic (implemented as modules) and presentation (implemented as views in eZ publish's templating language). There is good support for pure content management features such as versioning, localization, and single sourcing. In addition to the content module, which handles all the content management services, and other modules, such as the user module and the web shop module, developers can create custom modules to work within the framework. The contributions section of the eZ systems website has a decent library of modules for free download.
The administrative user interface can be a little overwhelming for non-technical, uninitiated users. This may be a reflection of eZ publish's purist approach to content management, with its notions of content classes and objects, a node hierarchy, strong versioning, and translation. The administrative interface definitely makes the user feel like he is managing an application rather than just editing a website.
Content Structure and Editing
eZ publish ships with several basic content types, such as articles and various binary formats, as well as complex content types, such as a company (that could be used to support a feature like a partner directory) and a product. Content types can be defined at run time through the management UI by creating classes from attributes based on roughly 30 pre-defined datatypes (such as text, selection, and xml block). Each of these datatypes has an associated editing widget, so the content editing forms are built dynamically based on the composition of the content type. Certain datatypes can be designated as information collectors, which allows a content class to represent a form that external users can use to submit information (like a contact form). Collected information is stored in the database and is accessible through the management UI, where it can be exported to various text formats.
The XML block datatype is used to store user-formatted text for free-text areas. Content stored in these attributes is actually stored as XML markup rather than plain HTML. For example, rather than using an <H1> HTML tag, eZ publish requires a user to use <heading level="1">. The Online Editor (now bundled with the basic eZ publish package) gives the user a WYSIWYG form control so he does not have to be aware of the XML-based content behind the scenes. There are a couple of trade-offs associated with this design. On the negative side, it locks eZ publish into a single editor, whereas other CMSs can use any of several open source, or even closed-source, editors. Standard HTML editors allow the user to turn them off and paste complex HTML from other sources directly, without an editor interpreting and possibly interfering with the original source. Still, the Online Editor seems stable and handles pasted text reasonably well. Also on the negative side, all this XML transformation may contribute to eZ publish's hunger for computing power. On the positive side, storing content in more structured XML increases the system's understanding of the content and its potential to reuse it. For example, eZ publish is aware of every link entered into an XML block field and has a feature that checks for internal and external broken links.
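To make the trade-off concrete, here is a simplified illustration of the kind of transformation that happens at render time. The <heading> example comes from above; the other tag names are only meant to evoke the idea and are not eZ publish's exact schema:
<?php
// Presentation-neutral stored markup (simplified) and the HTML it becomes.
$stored = '<heading level="1">Welcome</heading>'
        . '<paragraph>Hello and <emphasize>welcome</emphasize> to our site.</paragraph>';

$html = str_replace(
    array( '<heading level="1">', '</heading>', '<paragraph>', '</paragraph>', '<emphasize>', '</emphasize>' ),
    array( '<h1>', '</h1>', '<p>', '</p>', '<em>', '</em>' ),
    $stored
);
// Every page render pays for a transformation like this (one reason for the
// appetite for caching), but the neutral markup is what lets the system
// understand, reuse, and link-check the content.
?>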
In eZ publish, a page on a website has two components: the node, or location, which places the content in the site hierarchy; and the actual content object that stores the content. This design allows a single piece of content to be used in several places on the website. This feature is very useful for situations like having a news folder with all the news articles and then promoting certain articles on the home page. Another use is to have a content type that represents a "promotion" and place that promo on different areas of the site. Content objects are not associated with nodes until they are published. This can be confusing because it means that unpublished pages do not appear on the page tree of the administrative interface. They appear as drafts on the user's "My drafts" page.
Management
eZ publish does not support an in-site editing model where a user can navigate through the site and click on an icon to make an edit. Instead, eZ publish has a management interface that is standard (although it can be modified, most customers do not) and is kept totally separate from the visitor-facing website. The management interface is what a system administrator would expect to see in an administrative interface: lots of control and an engineer's logical view of the content. However, this view goes over the head of a non-technical, untrained content author or editor. There is the capability to turn off sections of the user interface, but all those sections are really needed to navigate and edit the content. If you deploy eZ publish, expect to train your non-technical users on a bit of content management theory and how to execute basic tasks. Once the initial learning curve is conquered, they will probably appreciate the control that the UI affords.
The architecture of keeping the management interface separate from the visitor view is controlled by a system called "siteaccess." Each siteaccess is a set of configurations that control what design is used and what content is being managed. Typically, a website will have one siteaccess for the administration interface and another siteaccess for the visitor-facing website. These two siteaccesses share the same content but have different design templates. The same mechanism can be used to support several co-branded websites with the same content, or several totally independent websites each with their own administrative interfaces. Some of the major hosting companies that deal with eZ publish have many websites running on the same eZ publish instance.
eZ publish has a very strong versioning and translation system. Each content object can have different versions (called "drafts" until they are "published" and "archived" if they are replaced by a newer published version). Content can be rolled back by copying an older version forward and publishing the copied version. By default, eZ publish will save 10 versions of a content object but that setting is configurable. Each version can have one or more translations. In this way, different translations of the same content object can be kept in sync. Different language versions of the same website are controlled by the siteaccess framework in which a primary language and fall back languages are set.
eZ publish comes with a very basic two-step workflow that can be extended by adding events (written in PHP) that affect the state of the asset and also can trigger additional functionality. Extending workflow is generally considered one of the more challenging customizations to make in eZ publish. Most customers stick with the basics.
The eZ publish access control system is very powerful and secure. Access is controlled by granting roles (bundles of policies which grant access with various limitations) to users or groups. Policies can be associated to sections of the site or branches of the node hierarchy. Groups can contain users or other sub groups.
Presentation and Layout
For the easiest-to-satisfy customers, the installation process includes a set of pre-built themes that can be modified through the UI (the navigation style, and so on). However, most customers (especially if they are using eZ publish for their corporate identity site) will want to do in-depth customization of the presentation templates. The way to do this is to follow a model of template overrides. When rendering a page, the templating engine looks in the design folders specified by the siteaccess for the appropriate template, and then falls back to the standard design if the appropriate templates are not found. Changing things like the overall page layout involves copying templates from the standard or other designs into the appropriate design folder and making modifications.
There is also a system for creating rules that determine what templates are selected. These rules can be based on the content type (or datatype), the section of the site, the original template that would have been used, or the individual node id. The rules are defined in a configuration file. Although the management interface theoretically could control these rules, it is so unusable that it is best to just edit the file.
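A rule looks roughly like this (reconstructed from memory, so check the exact key names against the documentation before relying on them): when the standard full view would have been used for an object of the article class, use the site design's article_full.tpl instead.
[full_article]
Source=node/view/full.tpl
MatchFile=article_full.tpl
Subdir=templates
Match[class_identifier]=article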
The actual syntax used to develop templates is based on the Smarty PHP templating engine. The general concepts and much of the syntax are similar but, at the time eZ systems was considering Smarty, it was too primitive for their needs so they pushed ahead with their own templating language. Since then, both Smarty and eZ publish's templating system have evolved, sometimes in similar ways. If a developer knows Smarty, eZ's templates will look very familiar. In fact, if you have Smarty-based text highlighting rules in your text editor, they will work nicely with eZ publish templates. The templating engine is also part of eZ components, so it will compete with Smarty to be used in other open source projects or custom web applications. The eZ publish templating framework rigidly separates layout from application logic. There is no way to write PHP code in a "scriptlet" as can be done with some other templating frameworks. Anything resembling programming logic must be created as an "operator," which can be registered with the templating framework. eZ ships with most, if not all, of the operators you need to build a site, such as comparing values, converting strings, and extracting data from content objects. Tags like "foreach" and "if" dictate control flow.
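A small fragment gives the flavor; this is simplified and from memory, so treat the attribute names as approximate rather than canonical:
{* List the children of the current node. 'wash' escapes output for HTML and
   'ezurl' quotes and prefixes the link; both are standard template operators. *}
<ul>
{foreach $node.children as $child}
  <li><a href={$child.url_alias|ezurl}>{$child.name|wash}</a></li>
{/foreach}
</ul>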
eZ publish manages two sets of URLs: the system URL, which is of the format [module]/[view]/parameters (for example, /content/view/12 to view node 12), and the search engine friendly "virtual URLs," which are automatically generated by the system based on the name of the node (example: "/about") and can be manually overridden (handy if you are migrating a site onto eZ publish and your URLs change).
The presentation layer supports three levels of caching: view caching, template caching, and static caching. View caching caches the output of a view (usually the central well of the page). Template caching stores compiled PHP code (generated from interpreting the templates), which is further optimized by using "cache-block" statements in the page templates. Static caching stores rendered static HTML on the file system for rapid access; when you configure the static cache, you tell it under what conditions to refresh. Additionally, eZ publish (for good reasons) recommends the use of a PHP accelerator, such as APC, which keeps compiled PHP code in memory and avoids the latency of repeatedly parsing files from disk.
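In a pagelayout template, a cache-block looks something like the following; the included template path is made up, and the available parameters (keys, expiry, and so on) have more variations than shown:
{cache-block expiry=3600}
  {include uri='design:parts/latest_news.tpl'}
{/cache-block}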
Turning off caching (necessary if you are doing development) reveals why there is so much caching built into the architecture. eZ publish is slow. Several factors contribute to this: the abstraction of the templating framework; conversion of XML blocks to HTML; the highly normalized and complex data model which supports user defined content classes, versioning, and translations; and the template override system. Just because eZ publish is written in PHP, do not think that it can be run on an inexpensive $20 per month virtual Linux server.
Still, caching, plus eZ publish's support for clustering (enhanced in version 3.8), makes eZ publish capable of supporting very high-traffic websites.
Community and Support
Over the past few months, the documentation, especially for the 3.8 release, has improved dramatically. However, if you get stuck, the best place to start with an eZ publish question is the forums, which are extremely active and have a searchable archive. The forums are moderated, so almost all questions get answered, although sometimes there is a delay between the question and the answer. There are several extremely active forum contributors who are credited with hundreds of posts. eZ systems also leverages blogs and RSS for community outreach.
Recently eZ systems has stepped up its presence in North America. After establishing an office in Vancouver and seeding it with some senior staff, eZ systems is building its North American partner program with lots of outreach and programs including training and partner events. eZ systems looks to partners for system implementation/integration work and bases its own revenue on support and value-add modules that it sells. eZ systems sells a network product that can be installed on a production instance to check for compliance, monitor its health, send updates, and be used by eZ systems support for issue diagnosis.
Roadmap/Vision
eZ systems follows a six-month release schedule. The most recent release was mostly architectural, including better support for clustering. The next major release will be more feature-oriented. eZ publish still does not support PHP 5 and there are no immediate plans to do so. However, this is not as big a problem as it might seem, because eZ systems employs one of the top committers on the PHP 4.x series.