Talk: Optimizing Pre-Trained Transformers in Conversational AI for Faster Inference, Better Accuracy
10:00 - 10:30am
As competition intensifies, brands are making customer service a critical differentiator. The order of the day is to provide faster, smarter contact center experiences. Delivering this level of service starts with well-trained agents who know your product, but increasingly AI can be used to augment these agents and make them even more effective.
At the same time, the emergence of transformers has seeded this AI opportunity, expanding what is possible in conversational AI. Libraries of pre-trained transformers, such as the HuggingFace library, promise to democratize access to these models. But pre-trained transformers have their limitations, especially when applied to real-world use cases. So how can modeling teams fine-tune pre-trained transformers to make them more impactful on real-world modeling problems?
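To illustrate that accessibility, here is a minimal sketch of loading an off-the-shelf summarization model from the HuggingFace transformers library and running it on a short support conversation. The checkpoint name `facebook/bart-large-cnn` and the sample dialogue are illustrative choices, not details from the talk.

```python
from transformers import pipeline

# Load a pre-trained BART summarization checkpoint from the HuggingFace hub.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

# A toy contact center conversation (illustrative only).
dialogue = (
    "Agent: Thanks for calling, how can I help you today? "
    "Customer: My router keeps dropping the connection every few minutes. "
    "Agent: I can reset the line from here, and if that doesn't fix it "
    "we'll ship a replacement unit at no charge."
)

# Generate a short abstractive summary of the conversation.
summary = summarizer(dialogue, max_length=60, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```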
In this talk, Mindtree deep learning architects Bhanu Prakash and Sulata Patra will discuss techniques for refining pre-trained transformers from the HuggingFace library to boost accuracy and accelerate inference. They'll focus on automating conversation summarization through the development of a customized version of HuggingFace's BART summarization model. Along the way, they'll share technical details on how they implemented solutions like SigOpt to enable this refinement through a guided experimentation and hyperparameter optimization process. And they'll provide real-world examples of how pre-trained BART performance compares to refined BART, and how this difference matters for their business.
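For readers unfamiliar with how a guided hyperparameter optimization loop works, the sketch below shows the general shape of SigOpt's experiment API applied to fine-tuning: define a search space, then repeatedly fetch a suggestion, train and evaluate with those assignments, and report the observed metric back. The parameter ranges, the `rouge_l` metric, and the `train_and_evaluate` helper are hypothetical stand-ins, not the speakers' actual setup.

```python
from sigopt import Connection

conn = Connection(client_token="YOUR_API_TOKEN")  # hypothetical placeholder token

# Define the search space for fine-tuning (illustrative parameters only).
experiment = conn.experiments().create(
    name="BART conversation summarization fine-tuning",
    parameters=[
        dict(name="learning_rate", type="double",
             bounds=dict(min=1e-6, max=1e-4)),
        dict(name="num_train_epochs", type="int",
             bounds=dict(min=1, max=5)),
    ],
    metrics=[dict(name="rouge_l", objective="maximize")],
    observation_budget=20,
)

def train_and_evaluate(assignments):
    """Hypothetical helper: fine-tune BART with the suggested
    hyperparameters and return a validation ROUGE-L score."""
    raise NotImplementedError

# Core SigOpt loop: fetch a suggestion, evaluate it, report the result.
while experiment.progress.observation_count < experiment.observation_budget:
    suggestion = conn.experiments(experiment.id).suggestions().create()
    score = train_and_evaluate(suggestion.assignments)
    conn.experiments(experiment.id).observations().create(
        suggestion=suggestion.id,
        value=score,
    )
    experiment = conn.experiments(experiment.id).fetch()
```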
Use SigOpt free. Sign up today.