
Transformer Encoder - RoBERTa

Modifications

  1. ????

Transformer Decoder - GPT-2

Modifications

  1. ????

Transformer Encoder-Decoder - T5

Modifications

  1. Freezing the first 20 layers
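Freezing the early layers means disabling their gradients so the optimizer only updates the remaining ones. A minimal sketch of the idea, using a toy 24-layer stack as a stand-in (the real run_translation_freezing.py presumably iterates over the T5 encoder blocks; the names and sizes below are assumptions for illustration):

```python
import torch.nn as nn

def freeze_first_n_layers(layers: nn.ModuleList, n: int) -> None:
    """Disable gradients for the first n layers so the optimizer skips them."""
    for layer in layers[:n]:
        for param in layer.parameters():
            param.requires_grad = False

# Toy stand-in for a 24-layer block stack (hypothetical, not the actual T5 config)
blocks = nn.ModuleList(nn.Linear(8, 8) for _ in range(24))
freeze_first_n_layers(blocks, 20)

# Each Linear has 2 parameter tensors (weight, bias):
# first 20 blocks frozen -> 40 frozen tensors, last 4 blocks -> 8 trainable
trainable = sum(1 for b in blocks for p in b.parameters() if p.requires_grad)
frozen = sum(1 for b in blocks for p in b.parameters() if not p.requires_grad)
```

With a real T5 checkpoint the same loop would run over something like `model.encoder.block[:20]` before the optimizer is constructed, since optimizers typically filter on `requires_grad` at creation time.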

Transformer in few-shot/zero-shot learning mode - ?????