Calibrations in QM


With the Talkdesk Quality Management™ (QM) “Calibrations” feature, Administrators can ensure scoring consistency and objectivity among evaluators by assigning a call to be evaluated by a group of evaluators and, later on, comparing the results of those evaluations.

There can be multiple calibration-type evaluations tied to a single recording. Calibration evaluations tied to the same recorded call can be compared, either side by side in the UI or in “Reporting”, to easily identify possible discrepancies (e.g., which specific questions were scored differently).

Note: If you wish to install the QM app, please contact your Customer Success Manager.


Accessing Calibration Requests

As an Admin, you can follow the steps below to access the “Calibrations” page that shows the comparison between the different calibration evaluations:


1. Once you have logged in as an Administrator, go to the Calibrations menu [1].

2. On the “Calibrations” page [2], you can view the total number of calibrations you have made, as well as each request’s “Calibration name” [3], “Evaluators” [4], creation date and time [5], “Due Date” [6], “Score” (percentage) [7], and “Progress” [8].


When you select the “Calibration name”, you will be taken to the corresponding “Call recording” page, where you can play the recording [9] or click the Show screen button to watch the recording video [10]. On this page, the option to open the video recording in a new tab is disabled [11], and it is not possible to add new comments during the call.


Requesting a Calibration

To learn how to request a call calibration, please follow the steps below:


1. Click on the Calls tab [1].

2. Select the Play button next to the call you wish to evaluate [2].


3. Then, click on the Advanced Player button [3]. This action will take you to the “Evaluations” menu within the Quality Management app.


4. Select the Request Calibration button [4]. The resulting “To do” evaluation that evaluators receive on their “Evaluations” list works the same way as any other evaluation they might already have.


5. On the “Request Calibration” pop-up window [5], choose the specifications of the calibration session you wish to create:

  • Insert a “Name” for the calibration request [6].
  • Check the “Agent” name (filled in by default) [7].
  • Choose an “Evaluation form” template [8] from the drop-down list.
  • Search for the “Evaluators” [9] in the drop-down list.
  • Pick the “Due date” for the request [10] from the calendar icon.
  • Select the minimum percentage of completed evaluations required for the calibration to be considered complete [11]. Note: If the selected value is not reached by the due date, the calibration will be considered incomplete and the option to extend it will be available.

6. Once you have filled in all the mandatory fields, press Continue [12]. Once the request is sent, the evaluators will receive an evaluation in “To do” status on their “Evaluations” list and will be able to complete it.


  • The completed evaluation and the score generated as a result of the calibration will not count as an actual score toward performance metrics and reporting.
  • Calibration evaluations are not shown to agents. They can be compared to identify discrepancies, and they can only be deleted while not yet completed: on the “Calibrations” list page, deletion is available only for calibrations in the “Requested” or “Expired” status.
  • Calibration evaluations will have the “To do” status in the “Evaluations” menu, even if the “Random Sampling” feature is not active.
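The completion rule described above (a minimum percentage of completed evaluations by the due date) can be sketched as a small check. This is an illustrative sketch only; the function and parameter names are assumptions, not part of the Talkdesk product or API:

```python
def calibration_status(completed, total, minimum_pct, past_due):
    """Classify a calibration request by how many of the requested
    evaluations have been completed.

    completed   -- number of evaluations already completed
    total       -- number of evaluators the calibration was sent to
    minimum_pct -- minimum percentage of completed evaluations (0-100)
                   required for the calibration to count as complete
    past_due    -- True once the due date has passed
    """
    pct_done = 100 * completed / total
    if pct_done >= minimum_pct:
        return "complete"
    if past_due:
        # Below the threshold after the due date: the calibration is
        # incomplete, and the admin is offered the option to extend it.
        return "incomplete (extendable)"
    return "in progress"
```

For example, with 3 of 4 evaluations completed and a 75% minimum, the calibration counts as complete; with only 1 of 4 completed after the due date, it is incomplete and extendable.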


Editing Calibration Requests

After creating a calibration request, you can still update some specifications. To do so, follow these steps:


1. Go to the Calibrations menu [1].

2. Click on the name of the calibration you wish to edit [2].


3. Click on Edit [3].


4. On the “Edit calibration” pop-up window [4], update the specifications that can still be changed:

  • The “Name” [5].
  • The “Evaluators” [6].
  • The “Due date” [7].
  • The “Minimum acceptable to complete” [8].

5. Click on the Update button to save your changes [9]. Once you send the update, the changes will be reflected in the request the evaluators received.


Comparing Calibration Results


When there are evaluations in the “Completed” status [1], you can compare the results. Follow the instructions below to learn how to compare them:

1. Click on the Compare results button [2].


2. The “Call recording” section is hidden by default [3]. However, you can select the arrow to expand it and see the recording details.

3. On “General score” [4], you have the names of all the evaluators involved in the calibration [5], the score each one gave the agent under “Points” [6], as well as the “Score” in percentage format [7].

4. On “Evaluation results” [8], you have the form and all the questions within each section. The questions can have the following icons next to them:

  • Green: There are no discrepancies in the answers given to the question.
  • Yellow: There are discrepancies in one or more answers given by the evaluators to the question.
  • Grey: The question has a free-format text answer, so it is not possible to compare (but you can view what was written).
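The icon logic above amounts to a simple comparison of the answers every evaluator gave to the same question. The following is an illustrative sketch only; the function name and answer format are assumptions, not the Talkdesk data model:

```python
def question_icon(answers, free_text=False):
    """Return which discrepancy icon a question would get, given the
    answers all evaluators submitted for that question."""
    if free_text:
        # Free-format text answers cannot be compared automatically.
        return "grey"
    # All evaluators agree -> green; any disagreement -> yellow.
    return "green" if len(set(answers)) == 1 else "yellow"
```

For example, three evaluators answering "Yes", "Yes", "Yes" would yield the green icon, while "Yes", "No", "Yes" would yield yellow.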



To learn more about Calibrations permissions, you can read the “Defining and Editing Permissions for QM” section of this article.
