Aug 28, 2013
Over the years I have seen well over a hundred web content management system demos. I have even done a few myself. But it doesn't take watching many demos before you notice that they all follow the same general patterns. "Watch me change this page." "This is how I find the content I was looking for." "Doesn't this demo website look nicer than yours?" OK, the last one is more implied than said. In this era of Web Experience Management, there are some new elements to the story. "See how this page looks different to different audiences with different intents on different platforms?"
Don't get me wrong. I still get excited by the possibilities of technology that can show a visitor the perfect content in the ideal format. The problem is that achieving this goal assumes that the user knows what the perfect content and ideal format are. But we don't — not by a long shot. The best we can do is preview a page pretending to be a visitor; and that takes a lot of guesswork.
The missing element in this story is the visitor. That is significant because the visitor is the heart of WEM. After all, the "E" in "WEM" stands for the visitor's experience, not the experience of the content editor previewing the page. How do we get the visitor into the center of the demo's narrative? It is hard to switch back and forth between the visitor and the editor without making the demo appear choppy and disjointed. Besides, as I said earlier, we usually don't know as much about the visitor as we would like to.
Perhaps a better way would be to use data to represent the visitor. Having done demos, I know that showing transactional data (like web traffic) is a real challenge. You need to have lots of recent data to make the demo look realistic. This is why the analytics segment of a WCM demo usually falls flat. The demo environment usually just has a few hits — not enough to see any real trends. That said, wouldn't it be great to go through a scenario that begins with realistic visitor traffic data?
I could imagine a story that started in an analytics area with the identification of an underperforming section of the site. Perhaps you are getting search traffic on the wrong keywords. Maybe a lot of people get to the page but then you lose them. Maybe you dig a little deeper and you notice particularly low conversion rates within what you thought was a high potential audience segment. At this point you could show how you reconcile audience segments between the analytics package and the WCMS personalization engine. That is, the CMS might have a "big spender" visitor profile, but how do you see that same population on the analytics side?
From there, you might validate that these visitors are seeing what you configured them to see. Going deeper, you might notice that "big spenders" are predominantly coming from mobile devices. You preview the personalized page on a mobile emulator and, BAM, the answer hits you in the face. The graphic that you thought would be so compelling to this group doesn't scale very well on mobile and makes the page unusable. Only then do you start editing content.
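The segment analysis in this scenario could be sketched with a toy script: given a simple hit log, compute conversion rates per audience segment and device, and flag the underperformers worth investigating. The field names, sample data, and threshold here are hypothetical, not taken from any real analytics product.

```python
from collections import defaultdict

# Hypothetical hit log: each record tags a visit with its audience
# segment, device type, and whether the visitor converted.
hits = [
    {"segment": "big spender", "device": "mobile", "converted": False},
    {"segment": "big spender", "device": "mobile", "converted": False},
    {"segment": "big spender", "device": "desktop", "converted": True},
    {"segment": "casual", "device": "desktop", "converted": True},
    {"segment": "casual", "device": "mobile", "converted": False},
]

# Tally visits and conversions per (segment, device) pair.
totals = defaultdict(int)
conversions = defaultdict(int)
for hit in hits:
    key = (hit["segment"], hit["device"])
    totals[key] += 1
    conversions[key] += hit["converted"]

# Flag any group converting below an (arbitrary) 25% threshold.
THRESHOLD = 0.25
for key, total in sorted(totals.items()):
    rate = conversions[key] / total
    flag = "  <-- investigate" if rate < THRESHOLD else ""
    print(f"{key[0]:>12} / {key[1]:<8} {rate:.0%}{flag}")
```

With this sample data, "big spender" visitors on mobile convert at 0%, which is exactly the kind of anomaly that would lead you to preview the personalized page on a mobile emulator.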
There could be many variations of this story but the key point is that the platform is helping solve the biggest obstacle to engaging visitors: understanding them. I would argue that unless you understand your visitors to this level, you shouldn't even touch personalization functionality. You are not engaging visitors; you are stereotyping them and then ignoring them. You may even be making their experiences worse because you haven't really tested what they will see. You are flying blind. What kind of business value is there in a tool that facilitates unproductive tasks? I would argue less than none.
Aug 29, 2011
I just published my CMS Selection Workshop handout on Scribd. The handout contains:
CMS Selection Handout
Nov 21, 2008
I am finally going over my jboye08 notes and I found my scribbles from the Web Idol competition. For those of you who are unfamiliar with this event, each vendor gets seven minutes to demonstrate their product. "Celebrity" judges put in their quips (à la American Idol) and the audience votes on what product they like the most. In addition to being entertaining, the event gives insight into what the vendors think are their coolest features (they can't show everything in seven minutes) and what the audience responds to. Here is what the vendors demonstrated this year (in order of their appearance).
SDL Tridion showed:
eZ Systems showed:
- database integration through the administrative UI
This year Sitecore won first place and Hippo came in second. Interestingly, reigning champion eZ Systems did not show all the video functionality which helped them win last year. The audience seemed to be most attracted to clean user interfaces that looked simple to use. Advanced functionality like sophisticated workflow and database integration was less compelling. While an event like Web Idol does not translate into a software selection, I think this result reinforces the importance of simplicity and ease of use in a demo. Power, range, and flexibility get a product onto a short list but simplicity is what business users find sexy (at least as far as software demos go). If you are running a CMS selection, this means that you need to make sure that all of the products that demo to your selection team meet your core functional and non-functional requirements.
Feb 20, 2008
There has been an interesting thread on the CM Professionals mailing list discussing the efficacy of an RFP. Many participants cited frustration with an RFP process that wastes people's time with unnecessary formality and the pretension of an even contest. I think everyone who has been in business has witnessed the act of an RFP being distributed after the contract has been awarded. The RFP process, as it is commonly practiced, suffers from four major flaws:
- Buyers make their choices harder by forcing suppliers to submit identical proposals. I hate the expression "comparing apples to apples." What if you would prefer an orange? You want the vendor to show their individuality.
- Vendors are suspicious of the RFP process and try to limit their exposure by expending energy qualifying the deal and their chances rather than investing in their proposal. You don't get anything but canned demos and copy-paste responses until you have reached a short list.
- Companies with good products are too busy to respond to blind, widely distributed RFPs. The issuers of these RFPs tend to be flooded by responses from marginal companies with struggling products.
- The RFP process is, by nature, adversarial and not a good way to start a partnership. Imagine finding a spouse with an RFP.
That said, you need some kind of process to keep you from falling in love with the first pretty user interface you see. There are some good aspects of an RFP that are worth keeping. You need some kind of document that communicates your requirements and your selection process. It is important that all the candidates have the same baseline information to tailor their sales process. My process for selecting a CMS uses an RFP in this way. I don't send an RFP to more than three vendors that I think are a good match for the requirements. I am sure to communicate this fact to the candidates. Vendors that have worked with me before know that if I contact them, they should jump right to the short-list stage of their sales process, where victory is in sight.
While the RFP contains the baseline, vendors should be invited to ask questions to help them produce more compelling and tailored proposals. Their ability to ask the right questions (and actually listen to the answers) is a differentiator. I want them to understand as much as they can about my client so they can put together a great proposal.
The required written response to an RFP should be lean. Less time invested in the written response means that the vendor can put more work into preparing a customized demo that will resonate with the audience. The goal is not for the client to lock themselves in a room and read proposals. You want to engage with the vendor. If you have been selective about who is participating, you can spend the time with each vendor that you are evaluating and get to know each other.
While the RFP itself is not dead, the old RFP process has certainly outlived its use. The RFP should not be a restrictive conduit for communication. It should be a starting point for a dialog. The RFP should not be an open call for the market to stand forward and identify itself. The issuer of an RFP should already understand the market and be selective as to who it invites.
Software selection should be an active process, not a passive one. It takes investment and education from both sides to meet in the middle.
Nov 19, 2007
Following in the vein of Tony Byrne's 10 Steps to a Successful Vendor Demo and my post on how to sit through one, here is a link to Joel Spolsky's How to Demo Software article. Besides being an entertaining read (as all Joel's posts are), I think this article reinforces the importance of the performance and other production values to a vendor demo.
Sep 24, 2007
After you watch a couple of vendor demos, it doesn't take long to realize that the performance of the demo (how well the presenters know the product and how well they understand and connect with the audience) plays as much a part in the product impression as the quality and the capabilities of the product itself. Given that the sales team probably is not going to be around during your implementation or when your users first start using the system, this should scare you if you are basing your selection on the product demo. While it is important that a software vendor cares enough about your business to put some thought and effort into showing you the product, you also want to build your system on the most suitable product. Here are some tips to manage vendor demonstrations that will isolate the important aspects of the vendor and the product and filter out the extraneous information that may confuse or distract you. For those software vendors out there, I hope that you read this and also Tony Byrne's advice. For people selling and evaluating open source, there are some slight nuances that I will mention at the end but probably cover more thoroughly in a different post.
A successful demo is all about preparation. You need to prepare the vendor (or systems integrator or in house staff if you are evaluating non-commercial software) with the information they need so they can do their best. You also need to prepare the audience on what they should be looking for.
Only do demos with a short list of vendors. Work with someone who knows the market to give you a short list of products to look at. That doesn't mean asking someone "what is the best CMS." If they know anything, they will tell you that it depends on your requirements. If they have an opinion, well, it is just going to be that: an opinion. You need to focus on a short list for two reasons. First, if the vendor knows that he is in a field of 10 candidates, he is not going to invest as much in the demo. He will have a junior sales team give a generic demo. Second, when subjected to 10 demos, your staff will not invest as much of their attention in evaluating each product and they will start to muddle the products together.
Clearly define what you want the demo to show. Because content management systems (especially web content management systems) are so flexible, a demo should be a prototype that you define according to your requirements. Just like a prototype, you need to clearly specify what it needs to do. The approach that I find the most effective is using scenarios that describe tasks that need to be accomplished using the system. The demo should show how the user would accomplish that task using the product. The demo should also recognize the constraints introduced by your architecture. The vendor should not show you features that would not work in your architecture. Neither should they show features that you don't need. The demo should show your content. Ask the vendor to configure a content type that matches the most complex content type in your content model.
Validate that the vendor understands your requirements. Have the vendor prepare a written response describing how their product can support your scenarios. Review it and give them feedback with ample time to adjust their demo in case they misunderstood what you need.
Prepare the audience. Prepare your audience for the demo by telling them what they should be looking for. A scorecard that lists the scenarios is useful for keeping people's attention on their needs, not gimmicky features. If the audience does not understand basic content management theory (separation of content and layout, re-usability, content lifecycle, etc.), address that before the first demo. Vendors are actually pretty good at explaining that stuff but there are more effective uses of their time. Also, vendors tend to up their game when they realize they are dealing with a sophisticated audience.
During the demo
The demo should use everyone's time as effectively as possible and should be organized to ensure that vital information is communicated to the right people.
Limit company background information. The vendor should be able to introduce their company and make the case that it is a stable company, it gets content management, and knows your industry. However, you need to contain the amount of time that they take to do it. They should be able to build a level of credibility and comfort with the audience but not infringe on the time they have to talk about their product within your context. Hopefully your short-list exercise already pre-qualified the vendors along these lines.
Mind your manners. Even if your corporate culture thinks it is OK for staff to attend meetings in-body only, keep distractions to a minimum. Ask your audience to put aside their email, blackberries, and cell phones and pay attention. Give the vendors every opportunity to engage with the audience. If the vendor is missing the mark, don't tune out. Instead, help steer them back on course. If you can't do that, politely end the meeting as quickly as possible and be happy that you were able to eliminate an option in a very hard decision.
Mark your scorecards. Without making it feel like a Bingo hall, have the audience take notes in their scorecards so that they remember what they saw and their impressions. By the time they have gotten back to their desks and answered their first of fifty waiting emails, they will have forgotten half of what they saw.
Break up the meeting. A thorough demo is enough to tax anyone's endurance. Not everyone needs to hear everything and people tend to lose focus after sitting for long periods of time. I usually break up demos into three main sessions. The first is the company background and functional session that all the stakeholders should attend. This is when the vendor walks through the scenarios and helps business users visualize using the product to get their jobs done. The next session is the technical session that shows what is going on behind the scenes and how the system can be customized and integrated. All the business users that are still awake can leave for the technical part. If they are asleep, leave them alone and let them dream about life with better content management tools. They can use the rest. The third session is the project management and licensing part where the vendor talks about the licensing needs, cost, and professional services. Your project management people and tech leads need to be part of the discussion. Everyone else can go back and extinguish the fires that have probably ignited during their absence.
"Yes" is not a good enough answer. When you ask if the system can do something, don't let the vendor get away with a simple "yes." Have them show it. And if they are not prepared to show it, have them describe how it would work and how much effort it would take to get it to work like that. You could also ask to speak to other customers that are using the product in that way.
Don't wait long to get feedback from the audience. It doesn't take long for people to forget. Follow up and plan the next steps as soon as possible.
The post mortem. As soon as possible, get everyone in a room and have them express their observations and impressions. Ask them what they didn't see. Hopefully, they have notes on their scorecards. There were probably some scenarios that were not adequately explained. Get this information so you can follow up with the vendor.
Schedule follow ups. Talk through what additional information is needed with each of the vendors who earned further consideration. For the vendors that didn't make the cut, explain why. If the demo was a disaster but you think the product still has potential, you could give them another chance or you could take it as a sign that they are not prepared to support you. Remember, after the contract is signed, things are only going to get worse.
Prototype. If there is a question about something, build a prototype and allow users to bang on it. Different vendors will have different policies around this. Some create hosted sandboxes and allow business users to experiment. Others provide trial versions of the software so that a customer can attend training and try to build the prototype themselves.
Demos can either clarify or confuse, inform or misinform. If run properly, they can be the most important part of the selection process. At the end of the day, both you and the vendors are after the same goals. They want customers that are successful with their software. You want to be a customer that is successful with their software. However, that doesn't mean that a sales team can't get swept up trying to win a deal. It also doesn't mean that business users will not lose sight of their goals when distracted by flashy features and a compelling demo performance. Be up front about this and try to work together toward success.
What about open source software? For the commercial open source products out there, this advice still holds. You just want to be even more sensitive about using the vendor's time efficiently because they have less to gain in terms of licensing revenues. Assign some of your technical staff to dig around the product (and the community) for themselves. If a commercial open source vendor is able to invest in large sales teams, you can be pretty sure they have a pricing model (around support and maintenance) where they can collect revenues that are equivalent to commercially licensed software. Either that or they haven't had to think about building a sustainable software business yet. For community-based projects, you are not going to get a sales team. You should form an internal team (or pay a systems integrator) to build the prototype and play the role of the sales engineer. You probably need to do more homework to decide what platform(s) you should evaluate and be even more diligent in documenting your requirements. Otherwise, your developers will get drawn to nifty architectures and technology buzzwords and neglect what your business users need.