Transformer Encoder - RoBERTa

Modifications

  1. ????

Transformer Decoder - GPT-2

Modifications

  1. ????

Transformer Encoder-Decoder - T5

Modifications

  1. Freezing the first 20 layers
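
The layer-freezing step can be sketched in plain PyTorch. The toy model and the `freeze_first_layers` helper below are hypothetical stand-ins for illustration only (the actual experiment lives in run_translation_freezing.py and targets T5); the idea is simply to disable gradients for the first N layers so only the remaining layers and the task head are updated during fine-tuning:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the fine-tuned model: a stack of
# layers followed by a task head (names are illustrative).
class TinyEncoderModel(nn.Module):
    def __init__(self, num_layers=24, dim=32):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_layers))
        self.head = nn.Linear(dim, 2)

    def forward(self, x):
        for layer in self.layers:
            x = torch.relu(layer(x))
        return self.head(x)

def freeze_first_layers(model, n=20):
    # Disable gradients for the first `n` layers; the optimizer
    # will then skip them, leaving the rest of the model trainable.
    for layer in model.layers[:n]:
        for p in layer.parameters():
            p.requires_grad = False

model = TinyEncoderModel()
freeze_first_layers(model, n=20)
```

With a Hugging Face model the same pattern applies, iterating over the corresponding module list (e.g. the encoder blocks) instead of `model.layers`.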

Transformer in few-shot/zero-shot learning mode - ?????