Clear, simple and useful NLP blogs.
By someone who loves NLP, writing and teaching.
Customize transformer models to your domain
Transformers like BERT are pre-trained on a wide range of corpora, but can they achieve similar performance on domain-specific tasks? Learn how to customize transformers like BERT and RoBERTa to adapt to specific domains such as healthcare and finance.
BERT-ology at 100 kmph
Among the many BERT variants developed by machine learning researchers, several use interesting ideas that tackle problems common to a wide range of machine learning and NLP tasks, not just transformers. It is important to understand how and why they work!
Sign up for my newsletter and you’ll never miss a post.