NLP for Superior User Interaction

$130M+ in venture capital
3,000+ customers globally
$10B+ in managed outcomes

Problem statement

  • Quantive (Gtmhub) struggled to consistently provide customized recommendations to its customers. The self-service nature of creating OKRs often became cumbersome, and navigating to the right Insight was increasingly difficult.
  • The company needed a solution to help users find useful Insights quickly and to auto-fill fields, boosting customer satisfaction.

Approach and solution

  • Implemented an LSTM-based neural network for automatic form fill-in using TensorFlow.
  • Deployed a pre-trained Transformer model to provide real-time recommendations for Insights, facilitated by microservices fetching events from a Kafka data stream.
  • Established ML pipelines using TensorFlow, Jenkins, and MLflow that consume new datasets from Kafka and trigger model retraining for ongoing accuracy and relevance.
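The retraining trigger in the last step can be sketched in plain Python. The event shape, batch threshold, and retrain hook below are illustrative assumptions standing in for the Kafka consumer and the Jenkins/MLflow job, not the production pipeline:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class RetrainTrigger:
    """Accumulates new labeled records (e.g. from a Kafka topic) and
    fires a retraining job once a batch threshold is reached."""
    batch_size: int
    retrain: Callable[[List[dict]], None]  # stand-in for launching the ML pipeline
    _buffer: List[dict] = field(default_factory=list)
    runs: int = 0

    def on_event(self, record: dict) -> None:
        self._buffer.append(record)
        if len(self._buffer) >= self.batch_size:
            self.retrain(self._buffer)  # e.g. kick off a Jenkins/MLflow run
            self._buffer.clear()
            self.runs += 1

# Usage: feed events as they arrive from the stream consumer.
trigger = RetrainTrigger(batch_size=3, retrain=lambda batch: None)
for i in range(7):
    trigger.on_event({"text": f"sample {i}"})
print(trigger.runs)  # → 2 (retraining fired after the 3rd and 6th events)
```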

Impact achieved

  • Achieved over 95% accuracy in automatic entity recognition and form completion.
  • Significantly increased customer engagement.
  • Boosted the usage of OKRs and Insights by customers.

Expertise and scope

  • Technology Stack: Python, TensorFlow, Keras, MLflow, Docker, Jenkins, SonarQube, Azure Data Lake, Azure Data Factory, Azure Synapse, PostgreSQL

Overview

Quantive (previously Gtmhub) is a leading strategy execution software and services company inspired by the objectives and key results (OKRs) methodology. Their platform supports organizational alignment, improved visibility, and the creation of a result-driven culture.

Challenges

Rising digitalization and a relentless focus on operational excellence are driving product companies to innovate faster. In modern niche markets, consumers increasingly opt for data-driven products that enable intelligent, seamless digital experiences.

To turn their product vision into reality, our client needed help with data-driven enhancements that make the most of customer insights and place them ahead of the competition in a multimillion-dollar market category of their own.

Solution 

Our transformation journey started with agile analytical work across various segments of customer and product data to grasp key concepts and build knowledge of the product.

For the analytical workloads, we set up a PostgreSQL database connected to an automated data pipeline in Azure Cloud, using Azure Data Lake and Azure Data Factory to source data from MongoDB. As development progressed, we adopted Azure Synapse to handle data orchestration, superseding Azure Data Factory.

Once enough insights were gained, we proposed developing a long short-term memory (LSTM) deep learning model based on TensorFlow and integrating it as a core natural language processing (NLP) feature of the product.
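At inference time, a sequence model of this kind typically emits one label per token; decoding those labels back into form-field values can be sketched in plain Python. The BIO tag scheme and the field names below are illustrative assumptions, not the product's actual schema:

```python
def decode_bio(tokens, tags):
    """Collapse per-token BIO tags (B-FIELD / I-FIELD / O) into
    {field: value} pairs usable for form auto-fill."""
    fields, current_tag, current_tokens = {}, None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current_tag:  # close out the previous entity span
                fields[current_tag] = " ".join(current_tokens)
            current_tag, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_tag == tag[2:]:
            current_tokens.append(token)  # continue the current entity span
        else:
            if current_tag:
                fields[current_tag] = " ".join(current_tokens)
            current_tag, current_tokens = None, []
    if current_tag:  # flush a span that runs to the end of the sentence
        fields[current_tag] = " ".join(current_tokens)
    return fields

tokens = ["Increase", "NPS", "to", "60", "by", "Q4"]
tags   = ["O", "B-METRIC", "O", "B-TARGET", "O", "B-DEADLINE"]
print(decode_bio(tokens, tags))
# → {'METRIC': 'NPS', 'TARGET': '60', 'DEADLINE': 'Q4'}
```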

Our new feature provides real-time semantic suggestions inferred from customer-supplied text fields. Once the user enters text, it is sent to the model API, and predictions for the form fields are returned to the front end. Because the product architecture is based on microservices, the model service was embedded in this context via gRPC.
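The round trip can be sketched as a single function. The whitespace tokenizer and the stub model below are assumptions that stand in for the real gRPC service and the trained TensorFlow model:

```python
def suggest_fields(text: str, model) -> dict:
    """Mimics the model API round trip: the front end submits raw text,
    the service tokenizes it, runs inference, and returns predictions."""
    tokens = text.split()   # the real service uses a trained tokenizer
    tags = model(tokens)    # stand-in for the TensorFlow model call
    return {"tokens": tokens, "predictions": dict(zip(tokens, tags))}

# Stub model: flags capitalized tokens as candidate entities.
stub_model = lambda toks: ["ENTITY" if t[:1].isupper() else "O" for t in toks]

result = suggest_fields("Increase ARR this quarter", stub_model)
print(result["predictions"])
# → {'Increase': 'ENTITY', 'ARR': 'ENTITY', 'this': 'O', 'quarter': 'O'}
```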

By leveraging technologies such as Docker, MLflow, Jenkins, and SonarQube, we achieved complete automation of the data science and machine learning operationalization workflow for seamless model training, testing, evaluation, and deployment. As part of the ongoing monitoring setup, we developed Grafana dashboards with KPIs and metrics to monitor the real-time accuracy and performance of the service in production.
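The headline KPIs the dashboards track can be aggregated from raw request logs. A minimal sketch, assuming a hypothetical log record shape with per-request latency and a user-acceptance flag:

```python
from statistics import mean

def service_kpis(request_log):
    """Aggregate per-request logs into the headline dashboard metrics:
    mean response time (ms) and accuracy (share of accepted suggestions)."""
    latencies = [r["latency_ms"] for r in request_log]
    accepted = sum(1 for r in request_log if r["accepted"])
    return {
        "mean_latency_ms": round(mean(latencies), 1),
        "accuracy": round(accepted / len(request_log), 3),
    }

log = [
    {"latency_ms": 140, "accepted": True},
    {"latency_ms": 160, "accepted": True},
    {"latency_ms": 150, "accepted": False},
    {"latency_ms": 150, "accepted": True},
]
print(service_kpis(log))  # → {'mean_latency_ms': 150.0, 'accuracy': 0.75}
```

In production these values would be exported to Grafana rather than printed, but the aggregation logic is the same.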

Results

Our model service runs in a live Kubernetes cluster, serving thousands of customer requests daily. Through rigorous code optimization and testing, we achieved a mean response time of 150 milliseconds per request and a model accuracy of 92%. We continue to bring value to the customer by researching, building, testing, and delivering new, data-driven product features.