Physics master’s graduate, machine learning engineer and total word nerd.
Transformers: how machines have learned context
Keywords have long been central to natural language processing. Over the past ten years, however, the field has made vast improvements by leveraging not just the words themselves, but the context in which they appear. As a result, there has been a significant shift in how machines understand the underlying meaning of language. This talk will cover the history of approaches to context encoding, with a particular focus on the most recent innovation: the transformer. We will showcase how this state-of-the-art technique allows us at Chattermill to gain insight from customer feedback and uncover which decisions will drive the greatest impact.