GPT
GPT is a Transformer-based architecture and training procedure for natural language processing tasks. Training proceeds in two stages. First, a language modeling objective is applied to unlabeled data to learn the initial parameters of the network. These parameters are then adapted to a target task using the corresponding supervised objective.
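To make the two-stage procedure concrete, below is a minimal sketch in PyTorch. It is an illustration under stated assumptions, not the paper's implementation: the vocabulary, model dimensions, and data are toy values, a causally masked nn.TransformerEncoder stands in for the decoder-only stack, and the clf_head classification layer is a hypothetical stand-in for the task-specific head added in the supervised stage.

```python
# A minimal sketch of the two-stage procedure, assuming PyTorch. The toy
# vocabulary, model size, data, and classification head are illustrative
# assumptions, not the configuration used in the original paper.
import torch
import torch.nn as nn

VOCAB, D_MODEL, MAX_LEN, N_CLASSES = 1000, 64, 128, 2

class TinyGPT(nn.Module):
    def __init__(self):
        super().__init__()
        self.tok = nn.Embedding(VOCAB, D_MODEL)
        self.pos = nn.Embedding(MAX_LEN, D_MODEL)
        layer = nn.TransformerEncoderLayer(D_MODEL, nhead=4, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=2)
        self.lm_head = nn.Linear(D_MODEL, VOCAB)       # stage 1: next-token prediction
        self.clf_head = nn.Linear(D_MODEL, N_CLASSES)  # stage 2: supervised target task

    def forward(self, x, task="lm"):
        T = x.size(1)
        # Causal mask: each position may attend only to earlier positions.
        mask = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        h = self.tok(x) + self.pos(torch.arange(T))
        h = self.blocks(h, mask=mask)
        if task == "lm":
            return self.lm_head(h)       # (B, T, VOCAB) logits over next tokens
        return self.clf_head(h[:, -1])   # (B, N_CLASSES) from the final position

model = TinyGPT()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
ce = nn.CrossEntropyLoss()

# Stage 1: language-modeling objective on unlabeled text (random toy batch here).
tokens = torch.randint(0, VOCAB, (8, 32))
logits = model(tokens[:, :-1], task="lm")
loss = ce(logits.reshape(-1, VOCAB), tokens[:, 1:].reshape(-1))
loss.backward()
opt.step()
opt.zero_grad()

# Stage 2: the same parameters are adapted with a supervised objective.
labels = torch.randint(0, N_CLASSES, (8,))
loss = ce(model(tokens, task="clf"), labels)
loss.backward()
opt.step()
```

The point the sketch preserves is that both stages update the same Transformer parameters; only the output head and the training objective change between pretraining and adaptation.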
Related topics: GPT-2, ELMo, GPT-3, BERT, XLNet, RoBERTa, T5, GPT-Neo, ULMFiT, Label Smoothing
Key Scholars
Ilya Sutskever: 165,856 citations, 113 papers
Christopher D. Manning: 123,173 citations, 515 papers
Alexander J. Smola: 89,395 citations, 459 papers
Elhanan Helpman: 69,817 citations, 430 papers
Alexey Svyatkovskiy: 56,905 citations, 739 papers
Pieter Abbeel: 52,831 citations, 627 papers
Kazuaki Chayama: 48,707 citations, 1,783 papers
Edgar Erdfelder: 46,910 citations, 178 papers
John Shawe-Taylor: 41,627 citations, 580 papers
Bronwyn H. Hall: 38,558 citations, 378 papers