BERT performs very well on NLP tasks. This article presents a rich set of experiments on applying BERT to text classification, sharing tips on hyperparameter tuning and model improvements to further tap BERT's potential.
This post mainly covers the self-attention mechanism and the Transformer model. It is my own summary after reading the paper and other people's write-ups, not a complete translation of the paper.
QANet: Combining Local Convolution With Global Self-Attention For Reading Comprehension
Use Siteleaf CMS for your devlopr-jekyll blog
Deployment Guide for devlopr-jekyll blog using GitHub Pages and Travis CI
Getting Started - How to build a blog using devlopr-jekyll and GitHub Pages
Hi, my name is Hao Sun. I love machine learning and NLP.