Trackers and Analytics UI


The Trackers and Analytics UI provides a waveform visualization with conversation insights. The waveform highlights Topics in the timeline using color-coded timestamps, giving you a snapshot of when they occurred in the course of the conversation. You can also view Trackers with sentiment scores, transcripts, speaker information, and the other conversation insights described below.

In this version, the Trackers and Analytics UI is supported for audio conversations.

The Trackers and Analytics UI consists of the following components:

1. Waveform Timeline: The waveform timeline consists of color-coded timestamps that show exactly when each Topic was discussed in the conversation.
2. Topics with Sentiment Score: Hover your cursor over a Topic to view the Sentiment Score for that Topic. The Sentiment Score tells you whether the Topics discussed were positive or negative in nature. Read more in the Sentiment Polarity section.
3. Trackers: View the Trackers identified in the course of the conversation, along with how many times each Tracker occurred and who said it.
4. Analytics: Provides an overview of speaker talk and silence ratios and words per minute.
5. Transcript: Transcript of the conversation, with speaker separation if Speaker Diarization is enabled. See the Best Practices section below.
6. Speaker Analytics: A timeline showing each speaker's talk time, along with timestamps of when questions were asked and by whom.
Best Practices

To get the full-fledged version of the Trackers and Analytics UI, ensure that you have:

  1. A pre-configured set of Trackers: Ensure that you have created Trackers and processed the conversation with Symbl (i.e., a "conversationId" has been generated). This happens when you submit the conversation data to the Async API for processing. Read the step-by-step instructions here.

  2. Enabled Speaker Separation: Ensure that Speaker Separation is also enabled when submitting data to the Async API by passing enableSpeakerDiarization=true and diarizationSpeakerCount={number} as query parameters. Read more on the Speaker Separation page. When these optional parameters are set, the Speaker Analytics component can provide more detailed, per-speaker information. A sample request that sets these parameters is shown after the note below.

Please note that once the raw conversation data has been processed by Symbl (i.e., a "conversationId" has been generated), there is no way to retroactively add Trackers or enable speaker separation. In that case, you have to submit the conversation data to the Async API once more with the optional parameters.
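For reference, the request below sketches a single Async Audio URL API call that covers both points above: the diarization parameters are passed in the query string and a Tracker is passed in the request body. The tracker name, vocabulary, and speaker count are illustrative values only, not required settings.

# Submit the audio with diarization enabled (query string) and one illustrative Tracker (request body).
curl --location --request POST "https://api.symbl.ai/v1/process/audio/url?enableSpeakerDiarization=true&diarizationSpeakerCount=2" \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer $AUTH_TOKEN" \
--data-raw '{
  "url": "https://storage.googleapis.com/rammer-transcription-bucket/small.mp3",
  "name": "Business Meeting",
  "trackers": [
    {
      "name": "Pricing",
      "vocabulary": ["price", "cost", "budget"]
    }
  ]
}'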

Generating Trackers and Analytics UI

To generate the Trackers and Analytics UI, follow the steps given below:

1. Send a POST request to the Async Audio API


Process your audio file with Symbl by sending a POST request to the Async Audio URL API. This returns a conversationId.

If you have already processed your audio file and have the conversationId, skip to Step 2.

POST https://api.symbl.ai/v1/process/audio/url

Sample Request

curl --location --request POST "https://api.symbl.ai/v1/process/audio/url" \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer $AUTH_TOKEN" \
--data-raw '{
  "url": "https://storage.googleapis.com/rammer-transcription-bucket/small.mp3",
  "name": "Business Meeting",
  "confidenceThreshold": 0.6
}'

The url is a mandatory parameter in the request body and must be publicly accessible.

For more sample requests, see the detailed documentation for the Async Audio URL API.

Sample Response

{
  "conversationId": "5815170693595136",
  "jobId": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"
}
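Processing is asynchronous, so the Trackers and other insights become available only after the job completes. If you want to wait for completion before generating the UI, you can poll the Job API with the jobId from the response above. The snippet below is a minimal sketch that assumes the Job API endpoint GET https://api.symbl.ai/v1/job/{jobId} and requires jq for JSON parsing.

# Poll the Job API until the job status is "completed".
JOB_ID="9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"
while true; do
  STATUS=$(curl --silent --request GET "https://api.symbl.ai/v1/job/$JOB_ID" \
    --header "Authorization: Bearer $AUTH_TOKEN" | jq -r '.status')
  echo "Job status: $STATUS"
  if [ "$STATUS" = "completed" ]; then
    break
  fi
  sleep 5
done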

2. Enable CORS (for files hosted on Amazon S3)


CORS (Cross-Origin Resource Sharing) is required for files hosted on Amazon S3.

Why do I need to enable CORS?
The Trackers and Analytics UI has a visual component that renders the waveform from the audio resource at the URL. To generate these visuals, the browser needs read access to the audio frequency data, which requires a CORS configuration to be in place. By default, modern browsers block cross-origin reads of audio frequency data unless CORS allows them.

If your audio file is not on Amazon S3, skip to the next step.

To enable CORS for your Amazon S3 bucket:

  1. Go to the Amazon S3 Console (https://s3.console.aws.amazon.com/).

  2. Select the bucket where the audio file is hosted.

  3. Go to the Permissions tab.

  4. Scroll down to the Cross-origin resource sharing (CORS) section.

  5. Edit the JSON to enable CORS for the Symbl URL:

[
  {
    "AllowedHeaders": [
      "*"
    ],
    "AllowedMethods": [
      "GET",
      "HEAD"
    ],
    "AllowedOrigins": [
      "*"
    ],
    "ExposeHeaders": [],
    "MaxAgeSeconds": 3000
  }
]
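If you manage the bucket from the command line rather than the console, the same configuration can be applied with the AWS CLI. The sketch below assumes the JSON above is saved locally as cors.json and that your-audio-bucket is a placeholder for your bucket name.

# Apply the CORS configuration from a local file to the bucket hosting the audio.
aws s3api put-bucket-cors \
  --bucket your-audio-bucket \
  --cors-configuration file://cors.json

# Verify the CORS rules now set on the bucket.
aws s3api get-bucket-cors --bucket your-audio-bucket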

3. Send a POST request to the Experience API


Using the conversationId from Step 1, send a POST request to the Experience API:

POST https://api.symbl.ai/v1/conversations/{conversationId}/experiences

Request Body

curl --location --request POST "https://api.symbl.ai/v1/conversations/$CONVERSATION_ID/experiences" \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer $AUTH_TOKEN" \
--data-raw '{
  "name": "audio-summary",
  "audioUrl": "https://storage.googleapis.com/rammer-transcription-bucket/small.mp3"
}'

Request Body Params

Field               | Required  | Type   | Description
--------------------|-----------|--------|------------
name                | Mandatory | String | Must be audio-summary for the Trackers and Analytics UI.
audioUrl            | Mandatory | String | Must be the same URL that was submitted to the Async API to generate the conversationId.
summaryURLExpiresIn | Mandatory | Number | Sets the expiry time for the summary URL, in seconds. If 0 is passed, the URL never expires. The default is 2592000 (30 days).
caution

disableSummaryURLAuthentication is not supported because only secure URL generation is accepted, in line with mandatory security requirements.
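To illustrate the parameters above, the request below is a sketch that also sets summaryURLExpiresIn; the value of 0 (the URL never expires) is an example, not a recommendation.

curl --location --request POST "https://api.symbl.ai/v1/conversations/$CONVERSATION_ID/experiences" \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer $AUTH_TOKEN" \
--data-raw '{
  "name": "audio-summary",
  "audioUrl": "https://storage.googleapis.com/rammer-transcription-bucket/small.mp3",
  "summaryURLExpiresIn": 0
}'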

Response Body

{
  "name": "video-summary",
  "url": "https://meetinginsights.symbl.ai/meeting/#/eyJzZXNzaW9uSWQiOiI1ODU5NjczMDg1MzEzMDI0IiwidmlkZW9VcmwiOiJodHRwczovL3N0b3JhZ2UuZ29vZ2xlYXBpcy5jb20vcmFtbWVyLXRyYW5zY3JpcHRpb24tYnVja2V0L3NtYWxsLm1wNCJ9?showVideoSummary=true"
}

The url returned in the response body can then be opened in the browser to view the Trackers and Analytics UI.