The Real Story Group strongly advocates a test-based, scenario-driven approach to procurement; nevertheless, many enterprises want to apply a quantitative, spreadsheet-based assessment approach, and in some cases these spreadsheets become quite complex. We have advised on dozens of scoring spreadsheets and like to think we have seen them all.
And over the years, we've learned a hard truth: more detail and more complexity in a scoring methodology may not deliver you the right vendor. It can occasionally do the exact opposite. Ideally you would dispense with spreadsheet scoring entirely, in favor of a more practical, facilitated decision-making process. However, the latter is no panacea and needs to be carefully managed. I have seen qualitative approaches become too loose and subjective, and fail to achieve the best-fit solution.
In some cases you may have no choice but to formally justify your decisions with a quantitative rationale. The level of detail and the particular approach can vary. For example, the public sector or highly regulated industries might require more detailed scoring of shortlisted vendor RFP responses due to legal requirements, or quite simply the need to CYA.
If you have a 17-sheet spreadsheet with 238 lines of requirements, grouped into logical categories and scientifically weighted, you might feel you are doing a thorough job. But your scoring can only be based on what a vendor has supplied you in their RFP response document. And vendors are experts at filing RFP response documents.
Thus, you might be undertaking a detailed review, but not necessarily an accurate one. Some vendors outright lie (I have caught some out on occasion myself); many, if not most, massage the truth a bit (or a lot). Some vendors are honest, and as a result are penalized heavily, handing victory to a rival poorly qualified for the work who did a great job of creative writing in their RFP response.
One of my personal favorites was a vendor who had scored top (by quite a margin) in a very big records management RFP shortlist. The buyer -- a research customer of ours -- grew a little suspicious and asked me to independently score the responses using their methodology. I scored the same vendor the lowest. Why? Well, simply because when the vendor answered "Yes, with scripting" I took that to mean "No, we cannot," whereas internal scorers scored those responses positively.
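The swing a single scoring judgment like that can cause is easy to show with back-of-the-envelope arithmetic. The requirements, weights, and answers below are entirely hypothetical, purely to illustrate how treating "Yes, with scripting" as a yes or a no flips a vendor's total:

```python
# Hypothetical weighted RFP scoring. All requirements, weights, and
# answers are invented for illustration only.
weights = {"bulk_import": 3, "retention_rules": 5, "legal_hold": 4}

# Vendor's answers: True means a plain "Yes";
# "scripted" means "Yes, with scripting".
answers = {
    "bulk_import": True,
    "retention_rules": "scripted",
    "legal_hold": "scripted",
}

def score(answers, scripted_counts_as_yes):
    """Sum the weights of every requirement the vendor is credited with."""
    total = 0
    for req, ans in answers.items():
        if ans is True or (ans == "scripted" and scripted_counts_as_yes):
            total += weights[req]
    return total

print(score(answers, scripted_counts_as_yes=True))   # internal scorers' reading: 12
print(score(answers, scripted_counts_as_yes=False))  # "with scripting" read as no: 3
```

Same responses, same weights, and the vendor lands either at the top or the bottom of the shortlist depending on one interpretive rule, which is exactly what happened in the records management case above.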
This gets to the other big problem with quantitative scoring: it tends to favor questions and responses about what a product can do (or what the vendor says it can do) rather than what almost always constitutes the real discriminator: how a product works.
In the end, if you are stuck with a heavily quantitative selection process, make sure at least that you don't short-cut the rest of your diligence. Speak directly to vendor references, and bring the vendor into your premises to demonstrate the shipping product. Just remember you can encounter just as much sleight of hand in product demonstrations and vendor-supplied "references" as vendors exhibited playing the detailed RFP game.
Getting the balance right between functional fit and technical fit is hard enough. Finding a vendor you really want to work with, and that really wants to work with you, can be even harder. Take your time and do your homework.