How quickly a support agent can retrieve a customer's history and context when an issue is reported directly impacts the customer experience. In addition, how quickly a knowledge base can be updated with common issues, and how quickly in-progress incidents can be shared with support agents and customers, can differentiate a great customer service center from a good one.
Using LLMs and stream processing to solve these problems is a game changer: incidents can be assessed dynamically, significantly lowering mean time to resolution. The following is a case study of how Timeplus is helping a leading contact center software provider do just that. It describes how we construct a 360-degree customer view from customer interaction events (when they joined, what products they use, when they last reported an issue), categorize all support ticket conversations via AI in real time, and create rich context to pass to a conversational AI, making it much smoother to reach a support resolution.
For any organization running a contact center, it's common to rely on a range of IT systems, such as CRM, marketing, payment processing, and documentation tools, to manage various functions. When a customer calls, support engineers need access to as much relevant background information as possible. However, when this data is scattered across multiple databases, data warehouses, SaaS applications, and other systems, stitching together the right information becomes that much harder. Retrieving information from these disparate sources can be slow and cumbersome when done at the point of need: some systems return data within seconds, while others only provide bulk updates on an hourly or daily basis.
Some of these systems are not designed for low-latency interactive queries, and with point-to-point APIs, it’s difficult to achieve a unified, comprehensive view of the caller’s information. As a result, support agents often have to work with an incomplete picture and manually put together the context, leading to the frustrating experiences we’re all too familiar with.
This is where Timeplus comes in. By integrating source system data through streaming platforms such as Apache Kafka, Confluent, or Redpanda, or through Timeplus' high-performance streaming ingestion API, one can seamlessly correlate data from these disparate systems in real time as it is produced. By tapping into these feeds as data changes, Timeplus lets you construct a unified, easily accessible view of each customer that is always current. Being able to query this unified view at low latency ensures that support engineers have the most up-to-date and comprehensive information at their fingertips the moment an incident is reported.
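As a rough illustration of the ingestion side, the sketch below publishes a CRM-style event to a Kafka topic that Timeplus could then consume as an external stream. The broker address, topic name, and event fields are illustrative assumptions, not part of any specific integration.

```python
# Minimal sketch: emit a source-system event to Kafka for downstream consumption.
# Topic name, event shape, and broker address are illustrative placeholders.
import json
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda event: json.dumps(event).encode("utf-8"),
)

# A CRM-style event emitted when a customer opens a support ticket.
producer.send("crm_events", {
    "customer_id": "cust-001",
    "email": "guest@example.com",
    "event_type": "ticket_opened",
    "product": "acme-docs",
})
producer.flush()
```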
The streams of source data typically contain raw information that needs to be refined into a more user-friendly view. Timeplus offers robust stream processing capabilities that let you transform, filter, and aggregate individual streams, creating a view tailored to various access patterns such as HTTP APIs or database drivers like JDBC and ODBC. Additionally, Timeplus provides both row-based and column-based historical storage, enabling fast aggregation queries, key/value lookups, updates, and vector similarity searches. Unlike traditional stream processors, Timeplus eliminates the need to send data back to a relational database, document store, or data lake for querying. Instead, Timeplus serves as a unified analytics platform for data consolidation, filtering, aggregation, and querying.
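To make the stream processing step concrete, here is a minimal sketch, submitted from Python, of a Timeplus-style materialized view that keeps a per-customer web-activity summary continuously up to date. The HTTP endpoint, the run_statement helper, and the stream and column names are hypothetical placeholders, and the SQL is schematic rather than a tested pipeline; adapt it to your deployment's client library and streaming semantics.

```python
# Sketch: submit streaming SQL that continuously folds raw web-click events
# into a compact per-customer summary. Endpoint, helper, stream, and column
# names are hypothetical.
import requests

TIMEPLUS_SQL_ENDPOINT = "http://localhost:8000/api/v1/sql"  # hypothetical endpoint

def run_statement(sql: str) -> None:
    """Send one SQL statement to the (hypothetical) query endpoint."""
    requests.post(TIMEPLUS_SQL_ENDPOINT, json={"sql": sql}).raise_for_status()

# A materialized view that support tooling can later read with a fast key lookup.
run_statement("""
CREATE MATERIALIZED VIEW web_activity_by_customer AS
SELECT
    customer_id,
    count()       AS pages_visited,
    min(_tp_time) AS first_seen,
    max(_tp_time) AS last_seen
FROM web_clicks
GROUP BY customer_id
""")

# Similar views over the CRM and payment streams can be joined on customer_id
# at query time to serve the unified customer 360 view.
```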
With the customer 360 data served as a unified view, the next step is to programmatically link that information to each customer interaction. The following diagram shows an architecture that supports the sourcing, enrichment, and categorization of data, backed by a vector database, to maintain a continuously up-to-date customer 360 view.
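As a rough sketch of the categorization step in this architecture, the snippet below embeds an incoming ticket message and picks the nearest known issue category by vector similarity. The embed function is a toy placeholder for a real embedding model, and the in-memory category table stands in for the vectors that would live in the vector store.

```python
# Sketch of real-time ticket categorization by vector similarity.
# embed() is a toy placeholder for a real embedding model; the category
# vectors stand in for an index maintained in a vector store.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder embedding: hash characters into a small fixed-size vector."""
    vec = np.zeros(8)
    for i, ch in enumerate(text.lower()):
        vec[i % 8] += ord(ch)
    return vec / (np.linalg.norm(vec) or 1.0)

# Known issue categories with precomputed (here: placeholder) embeddings.
categories = {
    "billing": embed("payment charged twice refund invoice"),
    "authentication": embed("cannot log in SAML SSO password reset"),
    "performance": embed("dashboard slow query timeout latency"),
}

def categorize(ticket_text: str) -> str:
    """Return the category whose embedding is most similar to the ticket text."""
    t = embed(ticket_text)
    return max(categories, key=lambda name: float(np.dot(categories[name], t)))

print(categorize("I keep getting an error when setting up SAML authentication"))
```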
In-Context Learning (ICL) App Flows
Once the customer 360-degree view is available, the support agent can leverage it alongside a conversational AI: the customer information is injected into a generic LLM so that it can provide a much more fine-tuned answer using all the available information.
Real-Time Context Data Provider for LLM
Streaming ETL Pipeline to Provide Real-Time Context
When a customer contacts the call center, Timeplus can be queried with their ID, email, or phone number to build a real-time context that gathers all relevant information. This context isn’t just comprehensive—it’s always fresh, incorporating data as current as the web pages the customer is visiting right now or recent questions they've asked on user communities or social media.
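A minimal sketch of what that lookup could look like is shown below, assuming a hypothetical SQL-over-HTTP endpoint and illustrative view and column names; in practice you would use your Timeplus client of choice and parameterized queries rather than string interpolation.

```python
# Sketch of building a real-time caller context from already-materialized views.
# The endpoint, view/stream names, and columns are hypothetical; wire fetch()
# to your actual Timeplus client and bind parameters properly.
import requests

TIMEPLUS_SQL_ENDPOINT = "http://localhost:8000/api/v1/sql"  # hypothetical

def fetch(sql: str) -> list:
    """Run one query against the (hypothetical) SQL endpoint and return rows."""
    resp = requests.post(TIMEPLUS_SQL_ENDPOINT, json={"sql": sql})
    resp.raise_for_status()
    return resp.json().get("rows", [])

def build_context(identifier: str) -> dict:
    """Gather account, payment, and web-activity facts for one caller."""
    return {
        "account": fetch(
            "SELECT * FROM customer_360 "
            f"WHERE customer_id = '{identifier}' OR email = '{identifier}' "
            f"OR phone = '{identifier}'"
        ),
        "recent_payments": fetch(
            "SELECT payment_time, amount, txn_id FROM payments "
            f"WHERE email = '{identifier}' ORDER BY payment_time DESC LIMIT 5"
        ),
        "pages_today": fetch(
            "SELECT visit_time, page_url FROM web_clicks "
            f"WHERE email = '{identifier}' AND visit_time > now() - INTERVAL 1 DAY"
        ),
    }
```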
Here is an example of such a prompt, assembled from the real-time context:
You are Contact Center AI, assisting a support engineer in providing professional customer support for a caller named “Guest.” The caller may inquire about account creation, recent payments, or share product feedback. Please begin by greeting the caller warmly and professionally, then request any necessary identification details (e.g., account number, email) to locate their account.
The following information may be relevant to this inquiry:
- Account Creation: Guest created an account on [July 15, 2024] via the website using the email [guest@example.com]. The account was verified through an email confirmation sent at [10:15 AM].
- Recent Payment: Guest made a payment of $45.99 on [August 1, 2024] via credit card. The transaction ID is [TXN123456789].
- Social Media Activity: On [August 2, 2024], Guest posted on Twitter expressing satisfaction with the recent service received. The post garnered 25 likes and 5 replies, including responses from the community and the company’s official account.
- Web Analytics: Today, Guest visited 18 pages on docs.acme.com, focusing on setting up SAML Authentication.
- 2 seconds ago: Visited docs.acme.com/get-help
- 10:10-10:12 AM: Visited docs.acme.com/security/part1 and docs.acme.com/security/part2
- 10:03 AM: Visited docs.acme.com/security/overview
- 09:12 AM: Visited docs.acme.com/get-started
With this up-to-date context, Timeplus generates a prompt that enables an LLM to summarize the information and provide actionable insights to the support agent, ensuring they have the most relevant details at their fingertips.
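The sketch below shows one way that hand-off could look: the assembled context (reusing the example data above) is rendered into a prompt and passed to an LLM for summarization. The OpenAI client and model name are one possible choice and are our assumptions, not part of the Timeplus product.

```python
# Sketch: render the assembled context into a prompt and ask an LLM to
# summarize it for the support agent. Context values reuse the example data
# above; the provider, client library, and model name are assumptions.
from openai import OpenAI

context = {
    "caller": "Guest",
    "Account Creation": "created July 15, 2024 via the website using guest@example.com",
    "Recent Payment": "$45.99 on August 1, 2024 by credit card, transaction TXN123456789",
    "Web Analytics": "18 pages visited on docs.acme.com today, focused on SAML Authentication",
}

prompt_lines = [
    "You are Contact Center AI, assisting a support engineer in providing "
    f"professional customer support for a caller named {context['caller']}.",
    "The following information may be relevant to this inquiry:",
]
prompt_lines += [f"- {key}: {value}" for key, value in context.items() if key != "caller"]
prompt = "\n".join(prompt_lines)

client = OpenAI()  # reads OPENAI_API_KEY from the environment
reply = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any chat-capable model works here
    messages=[
        {"role": "system", "content": prompt},
        {"role": "user", "content": "Summarize this caller's situation and suggest next steps."},
    ],
)
print(reply.choices[0].message.content)
```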
What’s Next
The above architectures showcase Timeplus' powerful capabilities in streaming ETL, CDC, JOINs, and both historical and streaming queries. However, this is only the beginning of our strategic partnership with this leading contact center solution provider. Looking ahead, we plan to leverage streaming-first machine learning to continually monitor recent customer inquiries, identify patterns, and generate optimal responses. This approach will also enable more accurate predictions for outage maintenance and software updates, further enhancing the efficiency and effectiveness of customer support.
Timeplus as Real-Time ML Feature Platform
Ready to try Timeplus Enterprise? Try it free for 30 days.
Join our Timeplus Community! Connect with other users or get support in our Slack community.