In real projects, ERP selection fails when teams treat demos as evidence. A great demo can still hide the workflows where your organization carries real risk: close/reconciliation, exception handling, evidence archiving, and access governance.

A common issue we see is that requirements are listed as features, not as workflows with evidence outcomes. That produces “checkbox” scoring and forces the team back to spreadsheets after go-live.

For example, a workflow-led requirements process looks like this:

  1. Write requirements as workflow decisions (who approves, what evidence is produced, and what artifacts survive review).
  2. During demos, ask for the exception path: what happens when documents are missing or tolerances are exceeded.
  3. Score each vendor on evidence quality and traceability, not only UI polish.
  4. Confirm governance: roles, approvals, and audit logs that match your control model.
  5. Require proof for your “must not break” workflows using a structured test plan.

Note on methodology: This is an educational guide using representative scenarios drawn from public documentation and common implementation/audit patterns; it is not based on any one client’s confidential details. Validate decision frameworks with your internal stakeholders and advisors.

ERP selection decisions made under time pressure—or driven primarily by vendor demonstrations—produce poor outcomes. This walkthrough structures the process so that requirements, not demos, drive the decision.

  1. Define the selection criteria in writing before speaking to any vendor: which processes are must-have, which are nice-to-have, and which are outside scope.
  2. Score each criterion by business impact and by risk—a missing feature in a high-impact, low-risk area is different from a gap in a regulated, high-risk process.
  3. Issue a structured information request to shortlisted vendors covering functional coverage, industry implementation references, total cost of ownership, and support model.
  4. Run scripted demonstrations using your own business scenarios—not vendor-prepared scripts—and score each vendor against the same criteria.
  5. Conduct reference checks with organisations of similar size and complexity in the same industry segment, and speak with the implementation project manager—not the account executive.
  6. Complete a total cost of ownership model covering implementation, licences, customisation, integration, training, and five-year support costs before the final decision.
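Steps 2 through 4 above amount to a weighted scoring model: each criterion carries a business-impact weight and a risk multiplier, so a gap in a regulated, high-risk process costs a vendor more points than a gap in a low-risk area. A minimal sketch, in which every criterion name, weight, and vendor score is a hypothetical illustration rather than a recommendation:

```python
# Hypothetical weighted scoring model for comparing shortlisted ERP vendors.
# Each criterion has a business-impact weight (1-5) and a risk multiplier,
# so gaps in regulated, high-risk processes are penalised more heavily.

criteria = {
    # name: (impact_weight, risk_multiplier)
    "close_reconciliation": (5, 2.0),  # regulated, high risk
    "exception_handling":   (4, 1.5),
    "audit_logging":        (4, 2.0),
    "ui_polish":            (2, 1.0),  # easy to over-weight in demos
}

# Scripted-demo scores per vendor on a 1-5 scale (hypothetical figures).
vendor_scores = {
    "Vendor A": {"close_reconciliation": 4, "exception_handling": 3,
                 "audit_logging": 5, "ui_polish": 3},
    "Vendor B": {"close_reconciliation": 2, "exception_handling": 4,
                 "audit_logging": 3, "ui_polish": 5},
}

def weighted_total(scores):
    """Sum each score multiplied by its impact weight and risk multiplier."""
    return sum(scores[name] * impact * risk
               for name, (impact, risk) in criteria.items())

# Rank vendors by weighted total, highest first.
for vendor, scores in sorted(vendor_scores.items(),
                             key=lambda kv: -weighted_total(kv[1])):
    print(f"{vendor}: {weighted_total(scores):.1f}")
```

Note how the risk multiplier reverses the naive outcome: the vendor with the most polished UI scores lower once regulated workflows are weighted properly. The point of fixing these weights before any demo is that they cannot then be adjusted, consciously or not, to favour the most impressive presentation.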

Artifacts to expect:

  • Requirements matrix with weighting and scoring criteria.
  • Vendor response comparison against the requirements matrix.
  • Reference check notes from at least two industry peers per shortlisted vendor.
  • Total cost of ownership model for each shortlisted option.
  • Selection recommendation document with rationale and risk assessment.

What usually goes wrong (failure modes)

  • Selection is driven by the best demonstration rather than the best fit
    Vendor demonstrations optimise for impressive features, not for the specific requirements that matter to your organisation.
    Mitigation: Require vendors to demonstrate specific scenarios from your own operations, not their standard demo scripts. Score against a fixed criteria list rather than overall impression.
  • Implementation costs are significantly higher than the original estimate
    The initial cost estimate was based on a high-level scope that did not account for customisation, data migration, or integration complexity.
    Mitigation: Request a detailed implementation estimate based on a discovery workshop, not a sales conversation. Include a contingency of at least 20 percent for a typical mid-market implementation.
  • The system works for the pilot process but not for edge cases specific to your industry
    Industry-specific processes (such as job costing in construction or claims management in healthcare) were not tested in depth during selection.
    Mitigation: Identify your three highest-complexity processes and require a detailed working demonstration of each before the final shortlist is confirmed.
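The contingency maths behind the second mitigation above is simple but worth writing down so it survives into the TCO model. A sketch with entirely illustrative placeholder figures (the cost categories follow the five-year TCO scope described earlier; the amounts are not benchmarks):

```python
# Hypothetical five-year TCO model for one shortlisted ERP option.
# All monetary figures are illustrative placeholders, not benchmarks.

tco_items = {
    "implementation": 250_000,
    "licences_5yr":   180_000,
    "customisation":   60_000,
    "integration":     45_000,
    "training":        25_000,
    "support_5yr":    120_000,
}

# At least 20% contingency for a typical mid-market implementation.
CONTINGENCY = 0.20

subtotal = sum(tco_items.values())
total_with_contingency = subtotal * (1 + CONTINGENCY)

print(f"Subtotal:          {subtotal:>10,.0f}")
print(f"With contingency:  {total_with_contingency:>10,.0f}")
```

Running one of these per shortlisted vendor, with the same categories and the same contingency rate, is what makes the final cost comparison apples-to-apples.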

Controls and evidence checklist

  • Document selection criteria and weightings before any vendor contact.
  • Score each vendor against the same criteria using a consistent rubric.
  • Verify industry references independently—do not rely on vendor-supplied references alone.
  • Obtain a fixed-price or capped-cost implementation estimate, not a time-and-materials estimate.
  • Include post-go-live support terms and upgrade policy in the contract negotiation.

Implementation checklist

  1. Run an internal requirements workshop before issuing any vendor communication.
  2. Issue a structured information request to four to six vendors before conducting demonstrations.
  3. Shortlist to two or three vendors for deep-dive demonstrations and reference checks.
  4. Complete TCO modelling for each shortlisted option before final board or leadership approval.
  5. Negotiate contract terms including implementation milestones, support SLAs, and upgrade commitments.
  6. Communicate the decision internally and establish a steering committee before the implementation project begins.

Frequently asked questions

Where do teams usually lose time in ERP selection processes?

Most time is lost during vendor demonstrations when teams have not defined their must-have requirements in advance. Vendors fill unstructured time with polished feature tours that rarely surface the functionality your organisation actually needs. A requirement matrix—scored before demos begin—keeps evaluations focused and comparable. Teams that skip this step often restart the process after a poor initial selection.

What should we validate during vendor reference checks?

After a shortlist is selected, test the vendor's implementation track record in your specific industry segment, not just their general reference list. Ask for references from organisations of similar size and complexity, and speak directly with the project manager—not just the account executive—about what went wrong and how it was resolved. Industry-specific gaps that were not discovered until mid-implementation are the most common source of cost overruns.

When should we revise our evaluation criteria during the selection process?

Revisit your scoring criteria if new requirements emerge during deep-dive workshops that were not captured in the original matrix. It is better to delay the decision by two weeks to reassess than to proceed with a vendor whose product has a critical gap that only surfaced during detailed process mapping. Document every change to the criteria with a rationale so the decision process remains transparent and defensible.

Conclusion and next steps

ERP selection requires a requirements-led process, structured scoring, and independent reference validation—not a reaction to the most polished demonstration.

The four to six weeks invested in a structured selection process typically saves six to twelve months of post-implementation remediation for organisations that skip it.