Title and Copyright Information
Overview of Live Chat Sentiment Analysis Accelerator
Accelerator Reports
Audience
This documentation describes the features of the Live Chat Sentiment Analysis Accelerator for B2C Service and provides the configuration steps required to implement it.
Software Requirements
This accelerator uses a combination of software products, cloud applications, and native applications to implement the end-to-end solution. To obtain the necessary licenses or environments for Oracle software, work with the applicable Oracle Applications team supporting your organization.
Third-party Dependencies
The following table lists the third-party software libraries and their versions.
User Flow
When an end user initiates a chat with an agent, the Live Chat Sentiment Analysis Accelerator flow begins. This accelerator does not currently support real-time live updates because it uses asynchronous Custom Process Model (CPM) updates for the prediction flow. Supervisors can therefore expect a slight delay, from a few seconds to a minute, before the prediction evaluation results appear in the Monitoring Report.
Extension Flow
This flow captures “prediction of emotion” and “supervisor ask” for every new chat message.
CPM Flow
This flow describes the steps that occur when a row is inserted into the ChatAIPredictionInfo object, which invokes the Custom Process Model (CPM) to evaluate the prediction results saved for each chat message.
Feedback Flow
The feedback flow allows supervisors to correct any wrong predictions made by the model. The corrected data is pulled by the feedback job running in OCI Data Science, which combines this delta with the current training data and trains a new model. This ensures continuous learning and improvement of the model.
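The following Python sketch illustrates only the merge step of such a feedback job, assuming the corrected rows are exported as feedback_delta.csv and the current training set is training_data.csv in an Object Storage bucket; the bucket name, object names, and the message_id column are illustrative and not part of the packaged job.

import io
import oci
import pandas as pd

# Resource principal authentication is available when this runs as an OCI Data Science job.
signer = oci.auth.signers.get_resource_principals_signer()
object_storage = oci.object_storage.ObjectStorageClient(config={}, signer=signer)

namespace = object_storage.get_namespace().data
bucket = "sentiment-training-data"  # illustrative bucket name

def read_csv(object_name):
    # Download a CSV object and load it into a DataFrame.
    body = object_storage.get_object(namespace, bucket, object_name).data.content
    return pd.read_csv(io.BytesIO(body))

training = read_csv("training_data.csv")
delta = read_csv("feedback_delta.csv")

# Supervisor-corrected labels override the original rows for the same message.
merged = pd.concat([training, delta]).drop_duplicates(subset="message_id", keep="last")

buffer = io.StringIO()
merged.to_csv(buffer, index=False)
object_storage.put_object(namespace, bucket, "training_data.csv", buffer.getvalue())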
Model Deployment and Training
OCI Deployment Architecture
The following diagram shows the network architecture of the OCI Language service model deployed in the customer tenancy. The data used to train the model is stored in Object Storage, and access to Object Storage from the OCI Data Science job is controlled through a resource principal. The custom model is deployed in a private subnet of the Language service, and access to the model is exposed through OCI Language model endpoints.
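As an illustration of how a client reaches the deployed model, the following Python sketch authenticates with a resource principal and calls a custom text classification endpoint through the OCI Python SDK; the endpoint OCID and sample text are placeholders, and the exact request type used by the accelerator may differ.

import oci

# Inside an OCI Data Science job, authenticate with the resource principal;
# outside OCI, oci.config.from_file() could be used instead.
signer = oci.auth.signers.get_resource_principals_signer()
language = oci.ai_language.AIServiceLanguageClient(config={}, signer=signer)

# OCID of the custom model endpoint created by the stack (placeholder value).
endpoint_id = "ocid1.ailanguageendpoint.oc1..example"

response = language.batch_detect_language_text_classification(
    oci.ai_language.models.BatchDetectLanguageTextClassificationDetails(
        endpoint_id=endpoint_id,
        documents=[
            oci.ai_language.models.TextDocument(
                key="msg-1",
                text="I have been waiting for an hour and nobody is helping me.",
                language_code="en",
            )
        ],
    )
)

# Print the predicted labels and confidence scores for each document.
for doc in response.data.documents:
    for label in doc.text_classification:
        print(doc.key, label.label, label.score)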
Create OCI Vault in root Compartment
Configure SSO for Site
You need an RSA certificate to configure the SSO user for the ingestion job. This job pulls data from B2C Service and stores it in Object Storage for training the ML model.
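The certificate is typically generated with a tool such as OpenSSL; the following Python sketch shows an equivalent self-signed RSA certificate created with the cryptography package, with an illustrative subject and file names.

import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# Generate a 2048-bit RSA key pair for the SSO user.
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Build a self-signed certificate; the common name and validity period are illustrative.
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "sso-ingestion-job")])
now = datetime.datetime.utcnow()
cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=365))
    .sign(key, hashes.SHA256())
)

# Write the private key and certificate to PEM files (illustrative names).
with open("sso_private_key.pem", "wb") as f:
    f.write(
        key.private_bytes(
            serialization.Encoding.PEM,
            serialization.PrivateFormat.PKCS8,
            serialization.NoEncryption(),
        )
    )
with open("sso_certificate.pem", "wb") as f:
    f.write(cert.public_bytes(serialization.Encoding.PEM))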
Enable SSO Login for a User
Create a Stack from Zip File
A stack is a Terraform configuration that you can use to provision and manage your OCI resources.
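Stacks are usually created from the OCI Console; the following Python sketch shows the equivalent call through the OCI Resource Manager SDK, assuming a local zip file and a placeholder compartment OCID.

import base64
import oci

# Uses the default OCI CLI configuration; the zip file name and compartment OCID are placeholders.
config = oci.config.from_file()
resource_manager = oci.resource_manager.ResourceManagerClient(config)

with open("live_chat_sentiment_stack.zip", "rb") as f:
    zip_b64 = base64.b64encode(f.read()).decode()

stack = resource_manager.create_stack(
    oci.resource_manager.models.CreateStackDetails(
        compartment_id="ocid1.compartment.oc1..example",
        display_name="live-chat-sentiment-accelerator",
        config_source=oci.resource_manager.models.CreateZipUploadConfigSourceDetails(
            zip_file_base64_encoded=zip_b64
        ),
    )
).data

# Plan and apply jobs can then be created against stack.id with create_job().
print(stack.id)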
Create Custom Objects
Configure External Objects
You must use external objects to configure the connection to the OCI model endpoints. Separate connections are required for the sentiment model and the supervisor ask model.
Add Custom Configuration
Create a new configuration setting to define the site-specific custom settings that the CPM uses to invoke machine learning predictions.
Add Reports
Supervisors and administrators can use reports to monitor live chat sentiment and view sentiment details for wrapped-up chats. Two report definitions are packaged with the sample code. You can create new reports using the .NET console.
Install CPM File
The CPM script allows you to consume machine learning predictions. You must update the test contact ID in the CPM file.
Configure CPM Routing
Add Extensions
You can import Agent Browser UI extensions into Oracle B2C Service using the Extension Manager.
Import Workspace
Assign Workspace to Staff Profile
You need to assign the workspace for the ChatAIPredictionInfo custom object to the admin user who will review the predicted values and provide feedback.
Add DLM Policy to Purge Old Data
Object Designer Data Model Changes
Debug CPM Logs
You can use the Process Designer / Custom Process Models (CPM) to associate PHP scripts with object events. The accelerator uses a CPM to call the CX REST APIs to retrieve the chat messages and evaluate whether a chat has negative sentiment.
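Because the CPM runs asynchronously, one way to confirm that it executed is to query the rows it writes. The following Python sketch issues an illustrative ROQL query through the Connect REST API; the site URL, credentials, API version, and the "CO" package prefix for ChatAIPredictionInfo are placeholders for your environment.

import requests

# Illustrative values: replace the site URL, the API-enabled account, and the
# package prefix under which ChatAIPredictionInfo was created.
SITE = "https://your-site.custhelp.com"
AUTH = ("api_user", "api_password")
QUERY = (
    "SELECT ID, CreatedTime FROM CO.ChatAIPredictionInfo "
    "ORDER BY CreatedTime DESC LIMIT 10"
)

response = requests.get(
    f"{SITE}/services/rest/connect/v1.4/queryResults",
    params={"query": QUERY},
    auth=AUTH,
    headers={"OSvC-CREST-Application-Context": "Debug CPM output"},
)
response.raise_for_status()

# Each item holds the column names and rows returned for the query.
result = response.json()["items"][0]
print(result["columnNames"])
for row in result["rows"]:
    print(row)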
Debug Data Science Job Logs
Debug Extension Logs
Extension logs contain the extension invocation entries; any handler failure, if present, appears in the Message column.
Debug Terraform Deployment Logs
The Terraform module for OCI Logging is used to create logs and log groups for OCI services and custom logs. These logs help you identify and debug failures or errors that occur during the Terraform run.
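If a stack run fails, you can also pull the Terraform job logs programmatically. The following Python sketch retrieves error-level entries for a Resource Manager job through the OCI SDK; the job OCID is a placeholder.

import oci

config = oci.config.from_file()
resource_manager = oci.resource_manager.ResourceManagerClient(config)

# OCID of the failed plan or apply job (placeholder value).
job_id = "ocid1.ormjob.oc1..example"

# Page through the Terraform run logs and print only error-level entries.
logs = oci.pagination.list_call_get_all_results(resource_manager.get_job_logs, job_id).data
for entry in logs:
    if entry.level in ("ERROR", "FATAL"):
        print(entry.timestamp, entry.message)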