How to Evaluate ASR Solutions
Identifying requirements and testing vendor solutions is already a lot of work, and it’s harder still when you aren’t sure where to start, or worse, when vendors try to win your business with vanity metrics. When evaluating ASR (automated speech recognition) providers, accuracy, speed, cost, scale, and usability all matter, and each carries a different weight depending on your use case. Making sense of the metrics your tests produce in relation to those criteria is another big piece of homework. In this webinar, we aim to simplify the evaluation process with example requirements, criteria, and metrics, and we share a companion guide you can take into your next ASR evaluation.
In this on-demand webinar, you will learn...
- How to build criteria that fit your use case
- What to expect from benchmark or test outputs
- What key metrics such as WER (word error rate), WRR (word recognition rate), amWER (alphanumeric word error rate), and Effective Vocabulary really mean, and how they’re calculated (see the quick sketch below)
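
For a bit of intuition before the webinar: WER is typically computed as the word-level edit distance (substitutions + deletions + insertions) between a reference transcript and the ASR hypothesis, divided by the number of reference words, and WRR is simply 1 − WER. The sketch below is a minimal illustration under assumed normalization choices (lowercasing, whitespace tokenization), not the exact scoring pipeline discussed in the webinar.

```python
# Minimal WER sketch: word-level edit distance between a reference transcript
# and an ASR hypothesis, divided by the number of reference words.
# Normalization (casing, punctuation handling) is an assumption and varies by evaluator.

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate = (substitutions + deletions + insertions) / reference word count."""
    ref = reference.lower().split()
    hyp = hypothesis.lower().split()

    # Standard dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution (or match)
            )
    return d[len(ref)][len(hyp)] / max(len(ref), 1)


# Example: one substitution ("zero" -> "oh") across 7 reference words.
reference = "call me at nine three zero five"
hypothesis = "call me at nine three oh five"
error_rate = wer(reference, hypothesis)
print(f"WER: {error_rate:.2%}  WRR: {1 - error_rate:.2%}")  # WRR = 1 - WER
```

The same edit-distance idea underlies variants like amWER, which restricts scoring to alphanumeric spans (account numbers, addresses, and the like) where a single wrong character can be costly.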
Sign up to access the webinar on-demand!
MEET THE SPEAKER
Sam Goldfield
Solutions Engineer
Deepgram