BERT (Bidirectional Encoder Representations from Transformers) is a method for pre-training language representations: a general-purpose "language understanding" model is trained on a large text corpus (such as Wikipedia) and then applied to downstream NLP tasks such as machine translation and question answering. Project page: https://github.com/google-research/bert#fine-tuning-with-bert
android-viewpager-transformers: a collection of ViewPager transformers. This repo is a fork of daimajia's library; the author uploaded it to Maven Central and added more Javadoc. Download:

dependencies {
    compile 'com.eftimoff:android-viewpager-transformers:1.0'
}