projekt-glebokie/FLAN_T5.ipynb

Setup

Requirements

!pip install torch
!pip install datasets
!pip install transformers
!pip install scikit-learn
!pip install evaluate
!pip install accelerate
!pip install sentencepiece
!pip install protobuf
!pip install sacrebleu
!pip install py7zr
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Requirement already satisfied: torch in /usr/local/lib/python3.8/dist-packages (1.13.1+cu116)
Requirement already satisfied: typing-extensions in /usr/local/lib/python3.8/dist-packages (from torch) (4.4.0)
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting datasets
  Downloading datasets-2.9.0-py3-none-any.whl (462 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 462.8/462.8 KB 6.4 MB/s eta 0:00:00
Requirement already satisfied: pyarrow>=6.0.0 in /usr/local/lib/python3.8/dist-packages (from datasets) (9.0.0)
Collecting multiprocess
  Downloading multiprocess-0.70.14-py38-none-any.whl (132 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 132.0/132.0 KB 6.6 MB/s eta 0:00:00
Requirement already satisfied: dill<0.3.7 in /usr/local/lib/python3.8/dist-packages (from datasets) (0.3.6)
Requirement already satisfied: numpy>=1.17 in /usr/local/lib/python3.8/dist-packages (from datasets) (1.21.6)
Collecting xxhash
  Downloading xxhash-3.2.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (213 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 213.0/213.0 KB 7.1 MB/s eta 0:00:00
Requirement already satisfied: pyyaml>=5.1 in /usr/local/lib/python3.8/dist-packages (from datasets) (6.0)
Requirement already satisfied: fsspec[http]>=2021.11.1 in /usr/local/lib/python3.8/dist-packages (from datasets) (2023.1.0)
Requirement already satisfied: aiohttp in /usr/local/lib/python3.8/dist-packages (from datasets) (3.8.3)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from datasets) (23.0)
Requirement already satisfied: requests>=2.19.0 in /usr/local/lib/python3.8/dist-packages (from datasets) (2.25.1)
Requirement already satisfied: tqdm>=4.62.1 in /usr/local/lib/python3.8/dist-packages (from datasets) (4.64.1)
Requirement already satisfied: pandas in /usr/local/lib/python3.8/dist-packages (from datasets) (1.3.5)
Collecting responses<0.19
  Downloading responses-0.18.0-py3-none-any.whl (38 kB)
Collecting huggingface-hub<1.0.0,>=0.2.0
  Downloading huggingface_hub-0.12.0-py3-none-any.whl (190 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 190.3/190.3 KB 4.1 MB/s eta 0:00:00
Requirement already satisfied: async-timeout<5.0,>=4.0.0a3 in /usr/local/lib/python3.8/dist-packages (from aiohttp->datasets) (4.0.2)
Requirement already satisfied: yarl<2.0,>=1.0 in /usr/local/lib/python3.8/dist-packages (from aiohttp->datasets) (1.8.2)
Requirement already satisfied: charset-normalizer<3.0,>=2.0 in /usr/local/lib/python3.8/dist-packages (from aiohttp->datasets) (2.1.1)
Requirement already satisfied: aiosignal>=1.1.2 in /usr/local/lib/python3.8/dist-packages (from aiohttp->datasets) (1.3.1)
Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.8/dist-packages (from aiohttp->datasets) (6.0.4)
Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from aiohttp->datasets) (1.3.3)
Requirement already satisfied: attrs>=17.3.0 in /usr/local/lib/python3.8/dist-packages (from aiohttp->datasets) (22.2.0)
Requirement already satisfied: filelock in /usr/local/lib/python3.8/dist-packages (from huggingface-hub<1.0.0,>=0.2.0->datasets) (3.9.0)
Requirement already satisfied: typing-extensions>=3.7.4.3 in /usr/local/lib/python3.8/dist-packages (from huggingface-hub<1.0.0,>=0.2.0->datasets) (4.4.0)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.8/dist-packages (from requests>=2.19.0->datasets) (1.24.3)
Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.8/dist-packages (from requests>=2.19.0->datasets) (2.10)
Requirement already satisfied: chardet<5,>=3.0.2 in /usr/local/lib/python3.8/dist-packages (from requests>=2.19.0->datasets) (4.0.0)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.8/dist-packages (from requests>=2.19.0->datasets) (2022.12.7)
Collecting urllib3<1.27,>=1.21.1
  Downloading urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 140.6/140.6 KB 4.7 MB/s eta 0:00:00
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas->datasets) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas->datasets) (2022.7.1)
Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.8/dist-packages (from python-dateutil>=2.7.3->pandas->datasets) (1.15.0)
Installing collected packages: xxhash, urllib3, multiprocess, responses, huggingface-hub, datasets
  Attempting uninstall: urllib3
    Found existing installation: urllib3 1.24.3
    Uninstalling urllib3-1.24.3:
      Successfully uninstalled urllib3-1.24.3
Successfully installed datasets-2.9.0 huggingface-hub-0.12.0 multiprocess-0.70.14 responses-0.18.0 urllib3-1.26.14 xxhash-3.2.0
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting transformers
  Downloading transformers-4.26.1-py3-none-any.whl (6.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.3/6.3 MB 25.5 MB/s eta 0:00:00
Requirement already satisfied: requests in /usr/local/lib/python3.8/dist-packages (from transformers) (2.25.1)
Requirement already satisfied: pyyaml>=5.1 in /usr/local/lib/python3.8/dist-packages (from transformers) (6.0)
Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.8/dist-packages (from transformers) (2022.6.2)
Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.8/dist-packages (from transformers) (4.64.1)
Requirement already satisfied: numpy>=1.17 in /usr/local/lib/python3.8/dist-packages (from transformers) (1.21.6)
Requirement already satisfied: huggingface-hub<1.0,>=0.11.0 in /usr/local/lib/python3.8/dist-packages (from transformers) (0.12.0)
Requirement already satisfied: filelock in /usr/local/lib/python3.8/dist-packages (from transformers) (3.9.0)
Collecting tokenizers!=0.11.3,<0.14,>=0.11.1
  Downloading tokenizers-0.13.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (7.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.6/7.6 MB 60.1 MB/s eta 0:00:00
Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.8/dist-packages (from transformers) (23.0)
Requirement already satisfied: typing-extensions>=3.7.4.3 in /usr/local/lib/python3.8/dist-packages (from huggingface-hub<1.0,>=0.11.0->transformers) (4.4.0)
Requirement already satisfied: chardet<5,>=3.0.2 in /usr/local/lib/python3.8/dist-packages (from requests->transformers) (4.0.0)
Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.8/dist-packages (from requests->transformers) (2.10)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.8/dist-packages (from requests->transformers) (2022.12.7)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.8/dist-packages (from requests->transformers) (1.26.14)
Installing collected packages: tokenizers, transformers
Successfully installed tokenizers-0.13.2 transformers-4.26.1
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Requirement already satisfied: scikit-learn in /usr/local/lib/python3.8/dist-packages (1.0.2)
Requirement already satisfied: scipy>=1.1.0 in /usr/local/lib/python3.8/dist-packages (from scikit-learn) (1.7.3)
Requirement already satisfied: numpy>=1.14.6 in /usr/local/lib/python3.8/dist-packages (from scikit-learn) (1.21.6)
Requirement already satisfied: joblib>=0.11 in /usr/local/lib/python3.8/dist-packages (from scikit-learn) (1.2.0)
Requirement already satisfied: threadpoolctl>=2.0.0 in /usr/local/lib/python3.8/dist-packages (from scikit-learn) (3.1.0)
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting evaluate
  Downloading evaluate-0.4.0-py3-none-any.whl (81 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 81.4/81.4 KB 3.5 MB/s eta 0:00:00
Requirement already satisfied: xxhash in /usr/local/lib/python3.8/dist-packages (from evaluate) (3.2.0)
Requirement already satisfied: responses<0.19 in /usr/local/lib/python3.8/dist-packages (from evaluate) (0.18.0)
Requirement already satisfied: pandas in /usr/local/lib/python3.8/dist-packages (from evaluate) (1.3.5)
Requirement already satisfied: fsspec[http]>=2021.05.0 in /usr/local/lib/python3.8/dist-packages (from evaluate) (2023.1.0)
Requirement already satisfied: packaging in /usr/local/lib/python3.8/dist-packages (from evaluate) (23.0)
Requirement already satisfied: dill in /usr/local/lib/python3.8/dist-packages (from evaluate) (0.3.6)
Requirement already satisfied: datasets>=2.0.0 in /usr/local/lib/python3.8/dist-packages (from evaluate) (2.9.0)
Requirement already satisfied: numpy>=1.17 in /usr/local/lib/python3.8/dist-packages (from evaluate) (1.21.6)
Requirement already satisfied: requests>=2.19.0 in /usr/local/lib/python3.8/dist-packages (from evaluate) (2.25.1)
Requirement already satisfied: huggingface-hub>=0.7.0 in /usr/local/lib/python3.8/dist-packages (from evaluate) (0.12.0)
Requirement already satisfied: multiprocess in /usr/local/lib/python3.8/dist-packages (from evaluate) (0.70.14)
Requirement already satisfied: tqdm>=4.62.1 in /usr/local/lib/python3.8/dist-packages (from evaluate) (4.64.1)
Requirement already satisfied: pyarrow>=6.0.0 in /usr/local/lib/python3.8/dist-packages (from datasets>=2.0.0->evaluate) (9.0.0)
Requirement already satisfied: aiohttp in /usr/local/lib/python3.8/dist-packages (from datasets>=2.0.0->evaluate) (3.8.3)
Requirement already satisfied: pyyaml>=5.1 in /usr/local/lib/python3.8/dist-packages (from datasets>=2.0.0->evaluate) (6.0)
Requirement already satisfied: filelock in /usr/local/lib/python3.8/dist-packages (from huggingface-hub>=0.7.0->evaluate) (3.9.0)
Requirement already satisfied: typing-extensions>=3.7.4.3 in /usr/local/lib/python3.8/dist-packages (from huggingface-hub>=0.7.0->evaluate) (4.4.0)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.8/dist-packages (from requests>=2.19.0->evaluate) (1.26.14)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.8/dist-packages (from requests>=2.19.0->evaluate) (2022.12.7)
Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.8/dist-packages (from requests>=2.19.0->evaluate) (2.10)
Requirement already satisfied: chardet<5,>=3.0.2 in /usr/local/lib/python3.8/dist-packages (from requests>=2.19.0->evaluate) (4.0.0)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.8/dist-packages (from pandas->evaluate) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/dist-packages (from pandas->evaluate) (2022.7.1)
Requirement already satisfied: attrs>=17.3.0 in /usr/local/lib/python3.8/dist-packages (from aiohttp->datasets>=2.0.0->evaluate) (22.2.0)
Requirement already satisfied: aiosignal>=1.1.2 in /usr/local/lib/python3.8/dist-packages (from aiohttp->datasets>=2.0.0->evaluate) (1.3.1)
Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.8/dist-packages (from aiohttp->datasets>=2.0.0->evaluate) (1.3.3)
Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.8/dist-packages (from aiohttp->datasets>=2.0.0->evaluate) (6.0.4)
Requirement already satisfied: yarl<2.0,>=1.0 in /usr/local/lib/python3.8/dist-packages (from aiohttp->datasets>=2.0.0->evaluate) (1.8.2)
Requirement already satisfied: charset-normalizer<3.0,>=2.0 in /usr/local/lib/python3.8/dist-packages (from aiohttp->datasets>=2.0.0->evaluate) (2.1.1)
Requirement already satisfied: async-timeout<5.0,>=4.0.0a3 in /usr/local/lib/python3.8/dist-packages (from aiohttp->datasets>=2.0.0->evaluate) (4.0.2)
Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.8/dist-packages (from python-dateutil>=2.7.3->pandas->evaluate) (1.15.0)
Installing collected packages: evaluate
Successfully installed evaluate-0.4.0
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting accelerate
  Downloading accelerate-0.16.0-py3-none-any.whl (199 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 199.7/199.7 KB 4.6 MB/s eta 0:00:00
Requirement already satisfied: numpy>=1.17 in /usr/local/lib/python3.8/dist-packages (from accelerate) (1.21.6)
Requirement already satisfied: torch>=1.4.0 in /usr/local/lib/python3.8/dist-packages (from accelerate) (1.13.1+cu116)
Requirement already satisfied: psutil in /usr/local/lib/python3.8/dist-packages (from accelerate) (5.4.8)
Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.8/dist-packages (from accelerate) (23.0)
Requirement already satisfied: pyyaml in /usr/local/lib/python3.8/dist-packages (from accelerate) (6.0)
Requirement already satisfied: typing-extensions in /usr/local/lib/python3.8/dist-packages (from torch>=1.4.0->accelerate) (4.4.0)
Installing collected packages: accelerate
Successfully installed accelerate-0.16.0
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting sentencepiece
  Downloading sentencepiece-0.1.97-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 18.9 MB/s eta 0:00:00
Installing collected packages: sentencepiece
Successfully installed sentencepiece-0.1.97
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Requirement already satisfied: protobuf in /usr/local/lib/python3.8/dist-packages (3.19.6)
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting sacrebleu
  Downloading sacrebleu-2.3.1-py3-none-any.whl (118 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 118.9/118.9 KB 4.4 MB/s eta 0:00:00
Collecting portalocker
  Downloading portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Requirement already satisfied: regex in /usr/local/lib/python3.8/dist-packages (from sacrebleu) (2022.6.2)
Collecting colorama
  Downloading colorama-0.4.6-py2.py3-none-any.whl (25 kB)
Requirement already satisfied: lxml in /usr/local/lib/python3.8/dist-packages (from sacrebleu) (4.9.2)
Requirement already satisfied: tabulate>=0.8.9 in /usr/local/lib/python3.8/dist-packages (from sacrebleu) (0.8.10)
Requirement already satisfied: numpy>=1.17 in /usr/local/lib/python3.8/dist-packages (from sacrebleu) (1.21.6)
Installing collected packages: portalocker, colorama, sacrebleu
Successfully installed colorama-0.4.6 portalocker-2.7.0 sacrebleu-2.3.1
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting py7zr
  Downloading py7zr-0.20.4-py3-none-any.whl (66 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 66.3/66.3 KB 3.0 MB/s eta 0:00:00
Collecting brotli>=1.0.9
  Downloading Brotli-1.0.9-cp38-cp38-manylinux1_x86_64.whl (357 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 357.2/357.2 KB 13.4 MB/s eta 0:00:00
Collecting inflate64>=0.3.1
  Downloading inflate64-0.3.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (94 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 94.5/94.5 KB 9.5 MB/s eta 0:00:00
Requirement already satisfied: psutil in /usr/local/lib/python3.8/dist-packages (from py7zr) (5.4.8)
Collecting pyzstd>=0.14.4
  Downloading pyzstd-0.15.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (378 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 379.0/379.0 KB 24.0 MB/s eta 0:00:00
Collecting pycryptodomex>=3.6.6
  Downloading pycryptodomex-3.17-cp35-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.1/2.1 MB 47.8 MB/s eta 0:00:00
Collecting pyppmd<1.1.0,>=0.18.1
  Downloading pyppmd-1.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (139 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 139.7/139.7 KB 13.7 MB/s eta 0:00:00
Collecting multivolumefile>=0.2.3
  Downloading multivolumefile-0.2.3-py3-none-any.whl (17 kB)
Collecting pybcj>=0.6.0
  Downloading pybcj-1.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (50 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 50.8/50.8 KB 4.9 MB/s eta 0:00:00
Collecting texttable
  Downloading texttable-1.6.7-py2.py3-none-any.whl (10 kB)
Installing collected packages: texttable, brotli, pyzstd, pyppmd, pycryptodomex, pybcj, multivolumefile, inflate64, py7zr
Successfully installed brotli-1.0.9 inflate64-0.3.1 multivolumefile-0.2.3 py7zr-0.20.4 pybcj-1.0.1 pycryptodomex-3.17 pyppmd-1.0.0 pyzstd-0.15.3 texttable-1.6.7

Imports

import os
import json
import torch
from google.colab import drive
from pathlib import Path
from typing import Dict, List
from datasets import load_dataset
from transformers import T5Tokenizer

Loading data

loaded_data = load_dataset('emotion')
!mkdir -v -p data
train_path = Path('data/train.json')
valid_path = Path('data/valid.json')
test_path = Path('data/test.json')
data_train, data_valid, data_test = [], [], []
Downloading builder script:   0%|          | 0.00/3.97k [00:00<?, ?B/s]
Downloading metadata:   0%|          | 0.00/3.28k [00:00<?, ?B/s]
Downloading readme:   0%|          | 0.00/8.78k [00:00<?, ?B/s]
WARNING:datasets.builder:No config specified, defaulting to: emotion/split
Downloading and preparing dataset emotion/split to /root/.cache/huggingface/datasets/emotion/split/1.0.0/cca5efe2dfeb58c1d098e0f9eeb200e9927d889b5a03c67097275dfb5fe463bd...
Downloading data files:   0%|          | 0/3 [00:00<?, ?it/s]
Downloading data:   0%|          | 0.00/592k [00:00<?, ?B/s]
Downloading data:   0%|          | 0.00/74.0k [00:00<?, ?B/s]
Downloading data:   0%|          | 0.00/74.9k [00:00<?, ?B/s]
Extracting data files:   0%|          | 0/3 [00:00<?, ?it/s]
Generating train split:   0%|          | 0/16000 [00:00<?, ? examples/s]
Generating validation split:   0%|          | 0/2000 [00:00<?, ? examples/s]
Generating test split:   0%|          | 0/2000 [00:00<?, ? examples/s]
Dataset emotion downloaded and prepared to /root/.cache/huggingface/datasets/emotion/split/1.0.0/cca5efe2dfeb58c1d098e0f9eeb200e9927d889b5a03c67097275dfb5fe463bd. Subsequent calls will reuse this data.
  0%|          | 0/3 [00:00<?, ?it/s]
mkdir: created directory 'data'
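
Before flattening the splits it is worth confirming the label schema of the loaded dataset. A minimal sanity-check sketch, assuming the emotion dataset exposes its labels as a datasets ClassLabel feature (which is what the integer ids used below correspond to):

# Sanity check of the loaded splits and label names (sketch only, not part of the pipeline)
print(loaded_data)
label_feature = loaded_data['train'].features['label']
print(label_feature.names)        # expected order: sadness, joy, love, anger, fear, surprise
print(loaded_data['train'][0])    # one raw example: text plus integer label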
for source_data, dataset, max_size in [
  (loaded_data['train'], data_train, None),
  (loaded_data['validation'], data_valid, None),
  (loaded_data['test'], data_test, None),
]:
  for i, data in enumerate(source_data):
    if max_size is not None and i >= max_size:
      break
    data_line = {
      'label': int(data['label']),
      'text': data['text'],
    }
    dataset.append(data_line)

print(f'Train: {len(data_train):6d}')
print(f'Valid: {len(data_valid):6d}')
print(f'Test: {len(data_test):6d}')
Train:  16000
Valid:   2000
Test:   2000
MAP_LABEL_TRANSLATION = {
    0: 'sadness',
    1: 'joy',
    2: 'love',
    3: 'anger',
    4: 'fear',
    5: 'surprise',
}
def save_as_translations(original_save_path: Path, data_to_save: List[Dict]) -> None:
    # Write a seq2seq ("s2s-") variant of the file, replacing the integer labels
    # with their text names. The replacement happens in place on data_to_save,
    # so this must run after the raw integer-label file has been saved.
    file_name = 's2s-' + original_save_path.name
    file_path = original_save_path.parent / file_name

    print(f'Saving into: {file_path}')
    with open(file_path, 'wt') as f_write:
        for data_line in data_to_save:
            label = data_line['label']
            new_label = MAP_LABEL_TRANSLATION[label]
            data_line['label'] = new_label
            data_line_str = json.dumps(data_line)
            f_write.write(f'{data_line_str}\n')
for file_path, data_to_save in [(train_path, data_train), (valid_path, data_valid), (test_path, data_test)]:
  print(f'Saving into: {file_path}')
  with open(file_path, 'wt') as f_write:
    for data_line in data_to_save:
      data_line_str = json.dumps(data_line)
      f_write.write(f'{data_line_str}\n')
  
  save_as_translations(file_path, data_to_save)
Saving into: data/train.json
Saving into: data/s2s-train.json
Saving into: data/valid.json
Saving into: data/s2s-valid.json
Saving into: data/test.json
Saving into: data/s2s-test.json
!head data/train.json
{"label": 0, "text": "i didnt feel humiliated"}
{"label": 0, "text": "i can go from feeling so hopeless to so damned hopeful just from being around someone who cares and is awake"}
{"label": 3, "text": "im grabbing a minute to post i feel greedy wrong"}
{"label": 2, "text": "i am ever feeling nostalgic about the fireplace i will know that it is still on the property"}
{"label": 3, "text": "i am feeling grouchy"}
{"label": 0, "text": "ive been feeling a little burdened lately wasnt sure why that was"}
{"label": 5, "text": "ive been taking or milligrams or times recommended amount and ive fallen asleep a lot faster but i also feel like so funny"}
{"label": 4, "text": "i feel as confused about life as a teenager or as jaded as a year old man"}
{"label": 1, "text": "i have been with petronas for years i feel that petronas has performed well and made a huge profit"}
{"label": 2, "text": "i feel romantic too"}
!head data/s2s-train.json
{"label": "sadness", "text": "i didnt feel humiliated"}
{"label": "sadness", "text": "i can go from feeling so hopeless to so damned hopeful just from being around someone who cares and is awake"}
{"label": "anger", "text": "im grabbing a minute to post i feel greedy wrong"}
{"label": "love", "text": "i am ever feeling nostalgic about the fireplace i will know that it is still on the property"}
{"label": "anger", "text": "i am feeling grouchy"}
{"label": "sadness", "text": "ive been feeling a little burdened lately wasnt sure why that was"}
{"label": "surprise", "text": "ive been taking or milligrams or times recommended amount and ive fallen asleep a lot faster but i also feel like so funny"}
{"label": "fear", "text": "i feel as confused about life as a teenager or as jaded as a year old man"}
{"label": "joy", "text": "i have been with petronas for years i feel that petronas has performed well and made a huge profit"}
{"label": "love", "text": "i feel romantic too"}
# create tiny datasets for debugging purposes
for file_name in ["s2s-train", "s2s-valid", "s2s-test"]:
  print(f"=== {file_name} ===")
  all_text = Path(f"data/{file_name}.json").read_text().split('\n')
  # the trailing newline leaves an empty final element, so each "-500" file ends up with 499 data lines (see wc below)
  text = all_text[:250] + all_text[-250:]
  Path(f"data/{file_name}-500.json").write_text("\n".join(text))
=== s2s-train ===
=== s2s-valid ===
=== s2s-test ===
!wc -l data/*
    499 data/s2s-test-500.json
   2000 data/s2s-test.json
    499 data/s2s-train-500.json
  16000 data/s2s-train.json
    499 data/s2s-valid-500.json
   2000 data/s2s-valid.json
   2000 data/test.json
  16000 data/train.json
   2000 data/valid.json
  41497 total
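As a quick check that the freshly written JSON-lines files are well formed, each line can be parsed back and the label names counted. A small sketch using one of the files created above:

# Parse a JSON-lines file back and count label names (sanity-check sketch)
from collections import Counter

with open('data/s2s-train-500.json') as f_check:
    rows = [json.loads(line) for line in f_check if line.strip()]
print(len(rows))  # 499, matching wc -l above
print(Counter(row['label'] for row in rows).most_common(3))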

GPU Info

!nvidia-smi
NVIDIA-SMI has failed because it couldn't communicate with the NVIDIA driver. Make sure that the latest NVIDIA driver is installed and running.
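
nvidia-smi failing here means the runtime has no usable GPU, so the fine-tuning below runs on the CPU (the training log later reports device: cpu). The same can be confirmed from Python; a short sketch:

# Check whether PyTorch can see a CUDA device; without one the Trainer falls back to CPU
print(torch.cuda.is_available())   # False on a CPU-only runtime
print(torch.cuda.device_count())   # 0 when no GPU is attached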

os.environ['TOKENIZERS_PARALLELISM'] = 'true'

Run

!wget 'https://git.wmi.amu.edu.pl/s444465/projekt-glebokie/raw/branch/master/run_translation.py' -O 'run_translation.py'
--2023-02-12 20:50:30--  https://git.wmi.amu.edu.pl/s444465/projekt-glebokie/raw/branch/master/run_translation.py
Resolving git.wmi.amu.edu.pl (git.wmi.amu.edu.pl)... 150.254.78.40
Connecting to git.wmi.amu.edu.pl (git.wmi.amu.edu.pl)|150.254.78.40|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 29878 (29K) [text/plain]
Saving to: run_translation.py

run_translation.py  100%[===================>]  29.18K  --.-KB/s    in 0.1s    

2023-02-12 20:50:31 (224 KB/s) - run_translation.py saved [29878/29878]
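
run_translation.py treats emotion classification as a text-to-text task: the text field is the source, the label string is the target, and --source_prefix is prepended to every input. A minimal sketch of what a single preprocessed example looks like under this framing (it mirrors the standard seq2seq preprocessing, not the exact code inside run_translation.py):

# Illustrative preprocessing sketch for the seq2seq framing used by the command below
from transformers import AutoTokenizer

sketch_tokenizer = AutoTokenizer.from_pretrained('google/flan-t5-small')
example = {'label': 'sadness', 'text': 'i didnt feel humiliated'}
model_inputs = sketch_tokenizer('emotion classification: ' + example['text'],
                                max_length=256, truncation=True)
model_inputs['labels'] = sketch_tokenizer(text_target=example['label'],
                                          max_length=128, truncation=True)['input_ids']
print(sketch_tokenizer.convert_ids_to_tokens(model_inputs['input_ids']))
print(model_inputs['labels'])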

torch.cuda.empty_cache()
!python run_translation.py \
  --cache_dir .cache_training \
  --model_name_or_path "google/flan-t5-small" \
  --freeze_weights \
  --train_file data/s2s-train.json \
  --validation_file data/s2s-valid.json \
  --test_file data/s2s-test.json \
  --per_device_train_batch_size 8 \
  --per_device_eval_batch_size 8 \
  --source_lang "text" \
  --target_lang "label" \
  --source_prefix "emotion classification" \
  --max_source_length 256 \
  --max_target_length 128 \
  --generation_max_length 128 \
  --do_train \
  --do_eval \
  --do_predict \
  --predict_with_generate \
  --num_train_epochs 1 \
  --overwrite_output_dir \
  --output_dir out/emotion/flan_t5
2023-02-12 20:50:37.909132: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer.so.7'; dlerror: libnvinfer.so.7: cannot open shared object file: No such file or directory; LD_LIBRARY_PATH: /usr/local/nvidia/lib:/usr/local/nvidia/lib64
2023-02-12 20:50:37.909340: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer_plugin.so.7'; dlerror: libnvinfer_plugin.so.7: cannot open shared object file: No such file or directory; LD_LIBRARY_PATH: /usr/local/nvidia/lib:/usr/local/nvidia/lib64
2023-02-12 20:50:37.909367: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Cannot dlopen some TensorRT libraries. If you would like to use Nvidia GPU with TensorRT, please make sure the missing libraries mentioned above are installed properly.
/usr/local/lib/python3.8/dist-packages/torch/cuda/__init__.py:497: UserWarning: Can't initialize NVML
  warnings.warn("Can't initialize NVML")
WARNING:__main__:Process rank: -1, device: cpu, n_gpu: 0, distributed training: False, 16-bits training: False
INFO:__main__:Training/evaluation parameters Seq2SeqTrainingArguments(
_n_gpu=0,
adafactor=False,
adam_beta1=0.9,
adam_beta2=0.999,
adam_epsilon=1e-08,
auto_find_batch_size=False,
bf16=False,
bf16_full_eval=False,
data_seed=None,
dataloader_drop_last=False,
dataloader_num_workers=0,
dataloader_pin_memory=True,
ddp_bucket_cap_mb=None,
ddp_find_unused_parameters=None,
ddp_timeout=1800,
debug=[],
deepspeed=None,
disable_tqdm=False,
do_eval=True,
do_predict=True,
do_train=True,
eval_accumulation_steps=None,
eval_delay=0,
eval_steps=None,
evaluation_strategy=no,
fp16=False,
fp16_backend=auto,
fp16_full_eval=False,
fp16_opt_level=O1,
fsdp=[],
fsdp_min_num_params=0,
fsdp_transformer_layer_cls_to_wrap=None,
full_determinism=False,
generation_max_length=128,
generation_num_beams=None,
gradient_accumulation_steps=1,
gradient_checkpointing=False,
greater_is_better=None,
group_by_length=False,
half_precision_backend=auto,
hub_model_id=None,
hub_private_repo=False,
hub_strategy=every_save,
hub_token=<HUB_TOKEN>,
ignore_data_skip=False,
include_inputs_for_metrics=False,
jit_mode_eval=False,
label_names=None,
label_smoothing_factor=0.0,
learning_rate=5e-05,
length_column_name=length,
load_best_model_at_end=False,
local_rank=-1,
log_level=passive,
log_level_replica=passive,
log_on_each_node=True,
logging_dir=out/emotion/flan_t5/runs/Feb12_20-50-42_fa0c2ce94be4,
logging_first_step=False,
logging_nan_inf_filter=True,
logging_steps=500,
logging_strategy=steps,
lr_scheduler_type=linear,
max_grad_norm=1.0,
max_steps=-1,
metric_for_best_model=None,
mp_parameters=,
no_cuda=False,
num_train_epochs=1.0,
optim=adamw_hf,
optim_args=None,
output_dir=out/emotion/flan_t5,
overwrite_output_dir=True,
past_index=-1,
per_device_eval_batch_size=8,
per_device_train_batch_size=8,
predict_with_generate=True,
prediction_loss_only=False,
push_to_hub=False,
push_to_hub_model_id=None,
push_to_hub_organization=None,
push_to_hub_token=<PUSH_TO_HUB_TOKEN>,
ray_scope=last,
remove_unused_columns=True,
report_to=['tensorboard'],
resume_from_checkpoint=None,
run_name=out/emotion/flan_t5,
save_on_each_node=False,
save_steps=500,
save_strategy=steps,
save_total_limit=None,
seed=42,
sharded_ddp=[],
skip_memory_metrics=True,
sortish_sampler=False,
tf32=None,
torch_compile=False,
torch_compile_backend=None,
torch_compile_mode=None,
torchdynamo=None,
tpu_metrics_debug=False,
tpu_num_cores=None,
use_ipex=False,
use_legacy_prediction_loop=False,
use_mps_device=False,
warmup_ratio=0.0,
warmup_steps=0,
weight_decay=0.0,
xpu_backend=None,
)
WARNING:datasets.builder:Using custom data configuration default-c2ce64b8b716f6fa
INFO:datasets.info:Loading Dataset Infos from /usr/local/lib/python3.8/dist-packages/datasets/packaged_modules/json
INFO:datasets.builder:Generating dataset json (/content/.cache_training/json/default-c2ce64b8b716f6fa/0.0.0/0f7e3662623656454fcd2b650f34e886a7db4b9104504885bd462096cc7a9f51)
Downloading and preparing dataset json/default to /content/.cache_training/json/default-c2ce64b8b716f6fa/0.0.0/0f7e3662623656454fcd2b650f34e886a7db4b9104504885bd462096cc7a9f51...
Downloading data files: 100% 3/3 [00:00<00:00, 9546.97it/s]
INFO:datasets.download.download_manager:Downloading took 0.0 min
INFO:datasets.download.download_manager:Checksum Computation took 0.0 min
Extracting data files: 100% 3/3 [00:00<00:00, 1686.94it/s]
INFO:datasets.utils.info_utils:Unable to verify checksums.
INFO:datasets.builder:Generating train split
INFO:datasets.builder:Generating validation split
INFO:datasets.builder:Generating test split
INFO:datasets.utils.info_utils:Unable to verify splits sizes.
Dataset json downloaded and prepared to /content/.cache_training/json/default-c2ce64b8b716f6fa/0.0.0/0f7e3662623656454fcd2b650f34e886a7db4b9104504885bd462096cc7a9f51. Subsequent calls will reuse this data.
100% 3/3 [00:00<00:00, 530.10it/s]
Downloading (…)lve/main/config.json: 100% 1.40k/1.40k [00:00<00:00, 178kB/s]
[INFO|configuration_utils.py:660] 2023-02-12 20:50:43,374 >> loading configuration file config.json from cache at .cache_training/models--google--flan-t5-small/snapshots/9471d3bc4f85c9012776f03c4c00fdfe0d789a95/config.json
[INFO|configuration_utils.py:712] 2023-02-12 20:50:43,378 >> Model config T5Config {
  "_name_or_path": "google/flan-t5-small",
  "architectures": [
    "T5ForConditionalGeneration"
  ],
  "d_ff": 1024,
  "d_kv": 64,
  "d_model": 512,
  "decoder_start_token_id": 0,
  "dense_act_fn": "gelu_new",
  "dropout_rate": 0.1,
  "eos_token_id": 1,
  "feed_forward_proj": "gated-gelu",
  "initializer_factor": 1.0,
  "is_encoder_decoder": true,
  "is_gated_act": true,
  "layer_norm_epsilon": 1e-06,
  "model_type": "t5",
  "n_positions": 512,
  "num_decoder_layers": 8,
  "num_heads": 6,
  "num_layers": 8,
  "output_past": true,
  "pad_token_id": 0,
  "relative_attention_max_distance": 128,
  "relative_attention_num_buckets": 32,
  "task_specific_params": {
    "summarization": {
      "early_stopping": true,
      "length_penalty": 2.0,
      "max_length": 200,
      "min_length": 30,
      "no_repeat_ngram_size": 3,
      "num_beams": 4,
      "prefix": "summarize: "
    },
    "translation_en_to_de": {
      "early_stopping": true,
      "max_length": 300,
      "num_beams": 4,
      "prefix": "translate English to German: "
    },
    "translation_en_to_fr": {
      "early_stopping": true,
      "max_length": 300,
      "num_beams": 4,
      "prefix": "translate English to French: "
    },
    "translation_en_to_ro": {
      "early_stopping": true,
      "max_length": 300,
      "num_beams": 4,
      "prefix": "translate English to Romanian: "
    }
  },
  "tie_word_embeddings": false,
  "transformers_version": "4.26.1",
  "use_cache": true,
  "vocab_size": 32128
}

Downloading (…)okenizer_config.json: 100% 2.54k/2.54k [00:00<00:00, 832kB/s]
Downloading (…)"spiece.model";: 100% 792k/792k [00:00<00:00, 13.5MB/s]
Downloading (…)/main/tokenizer.json: 100% 2.42M/2.42M [00:00<00:00, 11.5MB/s]
Downloading (…)cial_tokens_map.json: 100% 2.20k/2.20k [00:00<00:00, 712kB/s]
[INFO|tokenization_utils_base.py:1802] 2023-02-12 20:50:44,890 >> loading file spiece.model from cache at .cache_training/models--google--flan-t5-small/snapshots/9471d3bc4f85c9012776f03c4c00fdfe0d789a95/spiece.model
[INFO|tokenization_utils_base.py:1802] 2023-02-12 20:50:44,890 >> loading file tokenizer.json from cache at .cache_training/models--google--flan-t5-small/snapshots/9471d3bc4f85c9012776f03c4c00fdfe0d789a95/tokenizer.json
[INFO|tokenization_utils_base.py:1802] 2023-02-12 20:50:44,890 >> loading file added_tokens.json from cache at None
[INFO|tokenization_utils_base.py:1802] 2023-02-12 20:50:44,890 >> loading file special_tokens_map.json from cache at .cache_training/models--google--flan-t5-small/snapshots/9471d3bc4f85c9012776f03c4c00fdfe0d789a95/special_tokens_map.json
[INFO|tokenization_utils_base.py:1802] 2023-02-12 20:50:44,890 >> loading file tokenizer_config.json from cache at .cache_training/models--google--flan-t5-small/snapshots/9471d3bc4f85c9012776f03c4c00fdfe0d789a95/tokenizer_config.json
Downloading (…)"pytorch_model.bin";: 100% 308M/308M [00:03<00:00, 93.0MB/s]
[INFO|modeling_utils.py:2275] 2023-02-12 20:50:48,508 >> loading weights file pytorch_model.bin from cache at .cache_training/models--google--flan-t5-small/snapshots/9471d3bc4f85c9012776f03c4c00fdfe0d789a95/pytorch_model.bin
[INFO|configuration_utils.py:543] 2023-02-12 20:50:48,961 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

[INFO|modeling_utils.py:2857] 2023-02-12 20:50:49,986 >> All model checkpoint weights were used when initializing T5ForConditionalGeneration.

[INFO|modeling_utils.py:2865] 2023-02-12 20:50:49,987 >> All the weights of T5ForConditionalGeneration were initialized from the model checkpoint at google/flan-t5-small.
If your task is similar to the task the model of the checkpoint was trained on, you can already use T5ForConditionalGeneration for predictions without further training.
Downloading (…)neration_config.json: 100% 147/147 [00:00<00:00, 17.9kB/s]
[INFO|configuration_utils.py:507] 2023-02-12 20:50:50,288 >> loading configuration file generation_config.json from cache at .cache_training/models--google--flan-t5-small/snapshots/9471d3bc4f85c9012776f03c4c00fdfe0d789a95/generation_config.json
[INFO|configuration_utils.py:543] 2023-02-12 20:50:50,288 >> Generate config GenerationConfig {
  "_from_model_config": true,
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

INFO:__main__:Freezing encoder weights
INFO:__main__:Using translation prefix: "emotion classification: "
Running tokenizer on train dataset:   0% 0/16 [00:00<?, ?ba/s]INFO:datasets.arrow_dataset:Caching processed dataset at /content/.cache_training/json/default-c2ce64b8b716f6fa/0.0.0/0f7e3662623656454fcd2b650f34e886a7db4b9104504885bd462096cc7a9f51/cache-304c1621e4554a32.arrow
Running tokenizer on train dataset: 100% 16/16 [00:02<00:00,  5.46ba/s]
Running tokenizer on validation dataset:   0% 0/2 [00:00<?, ?ba/s]INFO:datasets.arrow_dataset:Caching processed dataset at /content/.cache_training/json/default-c2ce64b8b716f6fa/0.0.0/0f7e3662623656454fcd2b650f34e886a7db4b9104504885bd462096cc7a9f51/cache-139cd8723ba90bb0.arrow
Running tokenizer on validation dataset: 100% 2/2 [00:00<00:00,  4.73ba/s]
Running tokenizer on prediction dataset:   0% 0/2 [00:00<?, ?ba/s]INFO:datasets.arrow_dataset:Caching processed dataset at /content/.cache_training/json/default-c2ce64b8b716f6fa/0.0.0/0f7e3662623656454fcd2b650f34e886a7db4b9104504885bd462096cc7a9f51/cache-79a6ff323bf45e82.arrow
Running tokenizer on prediction dataset: 100% 2/2 [00:00<00:00,  5.29ba/s]
Downloading builder script: 100% 8.15k/8.15k [00:00<00:00, 4.19MB/s]
Downloading builder script: 100% 4.20k/4.20k [00:00<00:00, 2.23MB/s]
/usr/local/lib/python3.8/dist-packages/transformers/optimization.py:306: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning
  warnings.warn(
[INFO|trainer.py:1650] 2023-02-12 20:50:55,618 >> ***** Running training *****
[INFO|trainer.py:1651] 2023-02-12 20:50:55,619 >>   Num examples = 16000
[INFO|trainer.py:1652] 2023-02-12 20:50:55,619 >>   Num Epochs = 1
[INFO|trainer.py:1653] 2023-02-12 20:50:55,619 >>   Instantaneous batch size per device = 8
[INFO|trainer.py:1654] 2023-02-12 20:50:55,619 >>   Total train batch size (w. parallel, distributed & accumulation) = 8
[INFO|trainer.py:1655] 2023-02-12 20:50:55,619 >>   Gradient Accumulation steps = 1
[INFO|trainer.py:1656] 2023-02-12 20:50:55,619 >>   Total optimization steps = 2000
[INFO|trainer.py:1657] 2023-02-12 20:50:55,621 >>   Number of trainable parameters = 58049216
  0% 0/2000 [00:00<?, ?it/s][WARNING|logging.py:281] 2023-02-12 20:50:55,663 >> You're using a T5TokenizerFast tokenizer. Please note that with a fast tokenizer, using the `__call__` method is faster than using a method to encode the text followed by a call to the `pad` method to get a padded encoding.
{'loss': 0.8478, 'learning_rate': 3.7500000000000003e-05, 'epoch': 0.25}
 25% 500/2000 [15:58<52:46,  2.11s/it][INFO|trainer.py:2709] 2023-02-12 21:06:53,824 >> Saving model checkpoint to out/emotion/flan_t5/checkpoint-500
[INFO|configuration_utils.py:453] 2023-02-12 21:06:53,826 >> Configuration saved in out/emotion/flan_t5/checkpoint-500/config.json
[INFO|configuration_utils.py:336] 2023-02-12 21:06:53,829 >> Configuration saved in out/emotion/flan_t5/checkpoint-500/generation_config.json
[INFO|modeling_utils.py:1704] 2023-02-12 21:06:54,675 >> Model weights saved in out/emotion/flan_t5/checkpoint-500/pytorch_model.bin
[INFO|tokenization_utils_base.py:2160] 2023-02-12 21:06:54,678 >> tokenizer config file saved in out/emotion/flan_t5/checkpoint-500/tokenizer_config.json
[INFO|tokenization_utils_base.py:2167] 2023-02-12 21:06:54,678 >> Special tokens file saved in out/emotion/flan_t5/checkpoint-500/special_tokens_map.json
[INFO|tokenization_t5_fast.py:186] 2023-02-12 21:06:54,770 >> Copy vocab file to out/emotion/flan_t5/checkpoint-500/spiece.model
{'loss': 0.5567, 'learning_rate': 2.5e-05, 'epoch': 0.5}
 50% 1000/2000 [32:23<32:47,  1.97s/it][INFO|trainer.py:2709] 2023-02-12 21:23:19,358 >> Saving model checkpoint to out/emotion/flan_t5/checkpoint-1000
[INFO|configuration_utils.py:453] 2023-02-12 21:23:19,360 >> Configuration saved in out/emotion/flan_t5/checkpoint-1000/config.json
[INFO|configuration_utils.py:336] 2023-02-12 21:23:19,363 >> Configuration saved in out/emotion/flan_t5/checkpoint-1000/generation_config.json
[INFO|modeling_utils.py:1704] 2023-02-12 21:23:20,279 >> Model weights saved in out/emotion/flan_t5/checkpoint-1000/pytorch_model.bin
[INFO|tokenization_utils_base.py:2160] 2023-02-12 21:23:20,281 >> tokenizer config file saved in out/emotion/flan_t5/checkpoint-1000/tokenizer_config.json
[INFO|tokenization_utils_base.py:2167] 2023-02-12 21:23:20,282 >> Special tokens file saved in out/emotion/flan_t5/checkpoint-1000/special_tokens_map.json
[INFO|tokenization_t5_fast.py:186] 2023-02-12 21:23:20,383 >> Copy vocab file to out/emotion/flan_t5/checkpoint-1000/spiece.model
{'loss': 0.5195, 'learning_rate': 1.25e-05, 'epoch': 0.75}
 75% 1500/2000 [48:57<17:59,  2.16s/it][INFO|trainer.py:2709] 2023-02-12 21:39:53,410 >> Saving model checkpoint to out/emotion/flan_t5/checkpoint-1500
[INFO|configuration_utils.py:453] 2023-02-12 21:39:53,412 >> Configuration saved in out/emotion/flan_t5/checkpoint-1500/config.json
[INFO|configuration_utils.py:336] 2023-02-12 21:39:53,415 >> Configuration saved in out/emotion/flan_t5/checkpoint-1500/generation_config.json
[INFO|modeling_utils.py:1704] 2023-02-12 21:39:54,413 >> Model weights saved in out/emotion/flan_t5/checkpoint-1500/pytorch_model.bin
[INFO|tokenization_utils_base.py:2160] 2023-02-12 21:39:54,415 >> tokenizer config file saved in out/emotion/flan_t5/checkpoint-1500/tokenizer_config.json
[INFO|tokenization_utils_base.py:2167] 2023-02-12 21:39:54,415 >> Special tokens file saved in out/emotion/flan_t5/checkpoint-1500/special_tokens_map.json
[INFO|tokenization_t5_fast.py:186] 2023-02-12 21:39:54,507 >> Copy vocab file to out/emotion/flan_t5/checkpoint-1500/spiece.model
{'loss': 0.5205, 'learning_rate': 0.0, 'epoch': 1.0}
100% 2000/2000 [1:05:09<00:00,  1.73s/it][INFO|trainer.py:2709] 2023-02-12 21:56:04,721 >> Saving model checkpoint to out/emotion/flan_t5/checkpoint-2000
[INFO|configuration_utils.py:453] 2023-02-12 21:56:04,725 >> Configuration saved in out/emotion/flan_t5/checkpoint-2000/config.json
[INFO|configuration_utils.py:336] 2023-02-12 21:56:04,729 >> Configuration saved in out/emotion/flan_t5/checkpoint-2000/generation_config.json
[INFO|modeling_utils.py:1704] 2023-02-12 21:56:05,781 >> Model weights saved in out/emotion/flan_t5/checkpoint-2000/pytorch_model.bin
[INFO|tokenization_utils_base.py:2160] 2023-02-12 21:56:05,783 >> tokenizer config file saved in out/emotion/flan_t5/checkpoint-2000/tokenizer_config.json
[INFO|tokenization_utils_base.py:2167] 2023-02-12 21:56:05,783 >> Special tokens file saved in out/emotion/flan_t5/checkpoint-2000/special_tokens_map.json
[INFO|tokenization_t5_fast.py:186] 2023-02-12 21:56:05,926 >> Copy vocab file to out/emotion/flan_t5/checkpoint-2000/spiece.model
[INFO|trainer.py:1901] 2023-02-12 21:56:08,516 >> 

Training completed. Do not forget to share your model on huggingface.co/models =)


{'train_runtime': 3912.895, 'train_samples_per_second': 4.089, 'train_steps_per_second': 0.511, 'train_loss': 0.6111455688476563, 'epoch': 1.0}
100% 2000/2000 [1:05:12<00:00,  1.96s/it]
[INFO|trainer.py:2709] 2023-02-12 21:56:08,524 >> Saving model checkpoint to out/emotion/flan_t5
[INFO|configuration_utils.py:453] 2023-02-12 21:56:08,526 >> Configuration saved in out/emotion/flan_t5/config.json
[INFO|configuration_utils.py:336] 2023-02-12 21:56:08,530 >> Configuration saved in out/emotion/flan_t5/generation_config.json
[INFO|modeling_utils.py:1704] 2023-02-12 21:56:10,374 >> Model weights saved in out/emotion/flan_t5/pytorch_model.bin
[INFO|tokenization_utils_base.py:2160] 2023-02-12 21:56:10,376 >> tokenizer config file saved in out/emotion/flan_t5/tokenizer_config.json
[INFO|tokenization_utils_base.py:2167] 2023-02-12 21:56:10,376 >> Special tokens file saved in out/emotion/flan_t5/special_tokens_map.json
[INFO|tokenization_t5_fast.py:186] 2023-02-12 21:56:10,476 >> Copy vocab file to out/emotion/flan_t5/spiece.model
***** train metrics *****
  epoch                    =        1.0
  train_loss               =     0.6111
  train_runtime            = 1:05:12.89
  train_samples            =      16000
  train_samples_per_second =      4.089
  train_steps_per_second   =      0.511
INFO:__main__:*** Evaluate ***
[INFO|trainer.py:2964] 2023-02-12 21:56:10,500 >> ***** Running Evaluation *****
[INFO|trainer.py:2966] 2023-02-12 21:56:10,500 >>   Num examples = 2000
[INFO|trainer.py:2969] 2023-02-12 21:56:10,501 >>   Batch size = 8
[INFO|configuration_utils.py:543] 2023-02-12 21:56:10,515 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

  0% 0/250 [00:00<?, ?it/s][INFO|configuration_utils.py:543] 2023-02-12 21:56:11,237 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 29% 72/250 [01:11<03:19,  1.12s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:22,868 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 29% 73/250 [01:12<03:20,  1.13s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:24,026 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 30% 74/250 [01:13<03:08,  1.07s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:24,966 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 30% 75/250 [01:15<03:20,  1.14s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:26,274 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 30% 76/250 [01:16<03:18,  1.14s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:27,414 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 31% 77/250 [01:16<03:01,  1.05s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:28,234 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 31% 78/250 [01:17<02:47,  1.03it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:29,042 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 32% 79/250 [01:18<02:44,  1.04it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:29,981 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 32% 80/250 [01:19<02:49,  1.00it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:31,061 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 32% 81/250 [01:20<02:50,  1.01s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:32,083 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 33% 82/250 [01:21<02:33,  1.10it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:32,774 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 33% 83/250 [01:22<02:36,  1.07it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:33,769 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 34% 84/250 [01:23<02:28,  1.12it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:34,556 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 34% 85/250 [01:24<02:26,  1.13it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:35,437 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 34% 86/250 [01:25<02:26,  1.12it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:36,339 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 35% 87/250 [01:26<02:31,  1.07it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:37,362 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 35% 88/250 [01:27<02:56,  1.09s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:38,817 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 36% 89/250 [01:28<02:49,  1.05s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:39,795 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 36% 90/250 [01:29<03:01,  1.14s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:41,119 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 36% 91/250 [01:31<03:12,  1.21s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:42,502 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 37% 92/250 [01:32<02:59,  1.14s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:43,471 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 37% 93/250 [01:33<02:53,  1.11s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:44,502 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 38% 94/250 [01:34<02:43,  1.05s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:45,424 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 38% 95/250 [01:34<02:22,  1.09it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:46,044 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 38% 96/250 [01:35<02:16,  1.12it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:46,858 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 39% 97/250 [01:36<02:19,  1.09it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:47,832 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 39% 98/250 [01:37<02:08,  1.18it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:48,516 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 40% 99/250 [01:38<02:21,  1.07it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:49,671 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 40% 100/250 [01:39<02:12,  1.13it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:50,422 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 40% 101/250 [01:40<02:23,  1.04it/s][INFO|configuration_utils.py:543] 2023-02-12 21:57:51,564 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 41% 102/250 [01:42<02:54,  1.18s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:53,246 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 41% 103/250 [01:43<03:04,  1.26s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:54,694 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 42% 104/250 [01:44<03:07,  1.28s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:56,033 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 42% 105/250 [01:45<02:53,  1.20s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:57,030 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 42% 106/250 [01:46<02:29,  1.04s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:57,689 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 43% 107/250 [01:47<02:32,  1.07s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:58,835 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 43% 108/250 [01:48<02:26,  1.03s/it][INFO|configuration_utils.py:543] 2023-02-12 21:57:59,771 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 44% 109/250 [01:49<02:23,  1.02s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:00,758 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 44% 110/250 [01:50<02:15,  1.03it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:01,620 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 44% 111/250 [01:51<02:09,  1.08it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:02,452 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 45% 112/250 [01:51<01:55,  1.20it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:03,069 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 45% 113/250 [01:52<01:54,  1.19it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:03,916 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 46% 114/250 [01:53<02:00,  1.13it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:04,907 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 46% 115/250 [01:54<02:00,  1.12it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:05,831 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 46% 116/250 [01:55<02:07,  1.05it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:06,908 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 47% 117/250 [01:57<02:33,  1.16s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:08,545 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 47% 118/250 [01:58<02:48,  1.27s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:10,094 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 48% 119/250 [02:00<02:44,  1.26s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:11,306 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 48% 120/250 [02:01<02:31,  1.16s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:12,256 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 48% 121/250 [02:02<02:26,  1.13s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:13,316 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 49% 122/250 [02:02<02:16,  1.07s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:14,229 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 49% 123/250 [02:04<02:21,  1.12s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:15,460 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 50% 124/250 [02:05<02:16,  1.09s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:16,472 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 50% 125/250 [02:06<02:18,  1.11s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:17,631 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 50% 126/250 [02:07<02:05,  1.01s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:18,422 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 51% 127/250 [02:08<01:58,  1.04it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:19,270 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 51% 128/250 [02:08<01:53,  1.08it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:20,114 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 52% 129/250 [02:09<01:45,  1.14it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:20,869 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 52% 130/250 [02:10<01:47,  1.12it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:21,801 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 52% 131/250 [02:12<02:12,  1.12s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:23,437 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 53% 132/250 [02:13<02:28,  1.26s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:25,032 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 53% 133/250 [02:14<02:24,  1.23s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:26,206 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 54% 134/250 [02:16<02:17,  1.19s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:27,285 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 54% 135/250 [02:16<02:03,  1.07s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:28,079 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 54% 136/250 [02:17<01:49,  1.04it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:28,781 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 55% 137/250 [02:18<01:47,  1.05it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:29,717 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 55% 138/250 [02:19<01:40,  1.12it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:30,482 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 56% 139/250 [02:20<01:38,  1.13it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:31,354 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 56% 140/250 [02:20<01:31,  1.20it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:32,063 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 56% 141/250 [02:21<01:27,  1.25it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:32,790 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 57% 142/250 [02:22<01:30,  1.19it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:33,716 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 57% 143/250 [02:23<01:29,  1.19it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:34,548 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 58% 144/250 [02:24<01:37,  1.09it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:35,664 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 58% 145/250 [02:25<01:39,  1.06it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:36,661 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 58% 146/250 [02:27<01:58,  1.14s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:38,275 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 59% 147/250 [02:28<02:04,  1.21s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:39,635 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 59% 148/250 [02:29<01:57,  1.15s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:40,646 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 60% 149/250 [02:30<01:52,  1.12s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:41,685 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 60% 150/250 [02:31<01:46,  1.06s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:42,621 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 60% 151/250 [02:32<01:38,  1.00it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:43,467 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 61% 152/250 [02:33<01:36,  1.02it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:44,406 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 61% 153/250 [02:34<01:41,  1.05s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:45,614 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 62% 154/250 [02:35<01:37,  1.02s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:46,557 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 62% 155/250 [02:36<01:34,  1.01it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:47,499 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 62% 156/250 [02:37<01:35,  1.02s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:48,574 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 63% 157/250 [02:38<01:25,  1.09it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:49,244 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 63% 158/250 [02:38<01:18,  1.18it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:49,940 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 64% 159/250 [02:39<01:18,  1.16it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:50,836 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 64% 160/250 [02:40<01:19,  1.13it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:51,779 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 64% 161/250 [02:41<01:17,  1.14it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:52,629 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 65% 162/250 [02:42<01:27,  1.00it/s][INFO|configuration_utils.py:543] 2023-02-12 21:58:53,909 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 65% 163/250 [02:44<01:45,  1.21s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:55,624 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 66% 164/250 [02:45<01:48,  1.26s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:57,011 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 66% 165/250 [02:46<01:40,  1.18s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:57,990 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 66% 166/250 [02:47<01:27,  1.04s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:58,715 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 67% 167/250 [02:48<01:28,  1.07s/it][INFO|configuration_utils.py:543] 2023-02-12 21:58:59,846 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 67% 168/250 [02:49<01:20,  1.02it/s][INFO|configuration_utils.py:543] 2023-02-12 21:59:00,612 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 68% 169/250 [02:50<01:12,  1.11it/s][INFO|configuration_utils.py:543] 2023-02-12 21:59:01,325 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 68% 170/250 [02:51<01:16,  1.04it/s][INFO|configuration_utils.py:543] 2023-02-12 21:59:02,418 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 68% 171/250 [02:52<01:15,  1.05it/s][INFO|configuration_utils.py:543] 2023-02-12 21:59:03,374 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 69% 172/250 [02:53<01:21,  1.05s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:04,640 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 69% 173/250 [02:54<01:21,  1.06s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:05,734 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 70% 174/250 [02:55<01:25,  1.13s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:07,005 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 70% 175/250 [02:56<01:24,  1.13s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:08,157 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 70% 176/250 [02:58<01:33,  1.26s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:09,722 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 71% 177/250 [02:59<01:32,  1.27s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:10,997 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 71% 178/250 [03:01<01:33,  1.29s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:12,350 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 72% 179/250 [03:02<01:26,  1.22s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:13,395 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 72% 180/250 [03:02<01:15,  1.09s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:14,170 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 72% 181/250 [03:03<01:10,  1.02s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:15,021 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 73% 182/250 [03:04<01:05,  1.03it/s][INFO|configuration_utils.py:543] 2023-02-12 21:59:15,887 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 73% 183/250 [03:05<01:08,  1.03s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:17,053 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 74% 184/250 [03:06<01:01,  1.07it/s][INFO|configuration_utils.py:543] 2023-02-12 21:59:17,781 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 74% 185/250 [03:07<01:02,  1.03it/s][INFO|configuration_utils.py:543] 2023-02-12 21:59:18,813 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 74% 186/250 [03:08<01:00,  1.06it/s][INFO|configuration_utils.py:543] 2023-02-12 21:59:19,688 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 75% 187/250 [03:09<00:55,  1.14it/s][INFO|configuration_utils.py:543] 2023-02-12 21:59:20,416 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 75% 188/250 [03:10<00:56,  1.09it/s][INFO|configuration_utils.py:543] 2023-02-12 21:59:21,419 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 76% 189/250 [03:11<00:58,  1.04it/s][INFO|configuration_utils.py:543] 2023-02-12 21:59:22,500 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 76% 190/250 [03:12<01:06,  1.10s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:23,929 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 76% 191/250 [03:14<01:14,  1.26s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:25,568 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 77% 192/250 [03:15<01:17,  1.34s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:27,099 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 77% 193/250 [03:17<01:15,  1.33s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:28,378 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 78% 194/250 [03:18<01:06,  1.19s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:29,253 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 78% 195/250 [03:18<01:01,  1.12s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:30,222 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 78% 196/250 [03:19<00:55,  1.02s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:31,005 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 79% 197/250 [03:20<00:53,  1.00s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:31,965 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 79% 198/250 [03:21<00:53,  1.03s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:33,071 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 80% 199/250 [03:22<00:51,  1.00s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:34,007 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 80% 200/250 [03:23<00:51,  1.03s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:35,103 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 80% 201/250 [03:24<00:49,  1.01s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:36,058 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 81% 202/250 [03:28<01:22,  1.73s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:39,490 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 81% 203/250 [03:31<01:37,  2.07s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:42,333 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 82% 204/250 [03:32<01:30,  1.97s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:44,070 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 82% 205/250 [03:34<01:19,  1.78s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:45,390 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 82% 206/250 [03:35<01:07,  1.52s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:46,326 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 83% 207/250 [03:36<00:57,  1.34s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:47,248 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 83% 208/250 [03:36<00:50,  1.20s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:48,121 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 84% 209/250 [03:38<00:51,  1.24s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:49,464 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 84% 210/250 [03:39<00:47,  1.18s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:50,482 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 84% 211/250 [03:40<00:44,  1.13s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:51,509 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 85% 212/250 [03:41<00:44,  1.17s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:52,752 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 85% 213/250 [03:42<00:39,  1.06s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:53,582 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 86% 214/250 [03:43<00:37,  1.03s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:54,528 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 86% 215/250 [03:44<00:38,  1.11s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:55,839 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 86% 216/250 [03:45<00:37,  1.11s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:56,939 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 87% 217/250 [03:47<00:38,  1.18s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:58,290 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 87% 218/250 [03:48<00:39,  1.22s/it][INFO|configuration_utils.py:543] 2023-02-12 21:59:59,611 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 88% 219/250 [03:50<00:45,  1.46s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:01,634 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 88% 220/250 [03:51<00:38,  1.29s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:02,510 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 88% 221/250 [03:51<00:31,  1.09s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:03,138 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 89% 222/250 [03:52<00:28,  1.01s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:03,958 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 89% 223/250 [03:53<00:25,  1.05it/s][INFO|configuration_utils.py:543] 2023-02-12 22:00:04,782 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 90% 224/250 [03:54<00:27,  1.04s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:06,027 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 90% 225/250 [03:56<00:28,  1.13s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:07,350 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 90% 226/250 [03:57<00:26,  1.09s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:08,349 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 91% 227/250 [03:58<00:24,  1.07s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:09,373 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 91% 228/250 [03:59<00:22,  1.03s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:10,324 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 92% 229/250 [04:00<00:21,  1.02s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:11,303 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 92% 230/250 [04:01<00:23,  1.19s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:12,910 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 92% 231/250 [04:03<00:27,  1.46s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:14,996 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 93% 232/250 [04:05<00:28,  1.57s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:16,826 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 93% 233/250 [04:06<00:22,  1.33s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:17,590 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 94% 234/250 [04:07<00:19,  1.19s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:18,455 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 94% 235/250 [04:08<00:16,  1.09s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:19,297 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 94% 236/250 [04:09<00:15,  1.09s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:20,381 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 95% 237/250 [04:10<00:14,  1.12s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:21,566 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 95% 238/250 [04:11<00:12,  1.07s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:22,523 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 96% 239/250 [04:12<00:11,  1.00s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:23,371 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 96% 240/250 [04:13<00:09,  1.03it/s][INFO|configuration_utils.py:543] 2023-02-12 22:00:24,261 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 96% 241/250 [04:13<00:08,  1.05it/s][INFO|configuration_utils.py:543] 2023-02-12 22:00:25,191 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 97% 242/250 [04:14<00:07,  1.06it/s][INFO|configuration_utils.py:543] 2023-02-12 22:00:26,094 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 97% 243/250 [04:16<00:07,  1.03s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:27,323 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 98% 244/250 [04:17<00:06,  1.16s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:28,810 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 98% 245/250 [04:18<00:05,  1.18s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:30,042 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 98% 246/250 [04:20<00:05,  1.39s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:31,897 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 99% 247/250 [04:21<00:03,  1.23s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:32,749 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 99% 248/250 [04:22<00:02,  1.14s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:33,684 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

100% 249/250 [04:23<00:01,  1.10s/it][INFO|configuration_utils.py:543] 2023-02-12 22:00:34,683 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

100% 250/250 [04:25<00:00,  1.06s/it]
***** eval metrics *****
  epoch                   =        1.0
  eval_accuracy           =        1.0
  eval_bleu               =        0.0
  eval_gen_len            =        2.0
  eval_loss               =     0.4009
  eval_runtime            = 0:04:25.88
  eval_samples            =       2000
  eval_samples_per_second =      7.522
  eval_steps_per_second   =       0.94
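
The eval_bleu, eval_accuracy and eval_gen_len values above come from the metric function handed to the trainer. A minimal sketch (not the notebook's actual metric code) of how such a compute_metrics could produce these numbers for a Seq2SeqTrainer — the flan-t5-small checkpoint name and the exact-match definition of "accuracy" are assumptions made here for illustration:

import numpy as np
import evaluate
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-small")  # assumed checkpoint
bleu = evaluate.load("sacrebleu")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    # -100 marks ignored label positions; swap it for the pad id before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_preds = [p.strip() for p in tokenizer.batch_decode(preds, skip_special_tokens=True)]
    decoded_labels = [l.strip() for l in tokenizer.batch_decode(labels, skip_special_tokens=True)]

    # Corpus BLEU via sacrebleu (one reference per example).
    score = bleu.compute(predictions=decoded_preds,
                         references=[[l] for l in decoded_labels])["score"]
    # Exact-match accuracy on the decoded strings (an assumed definition).
    exact = float(np.mean([p == l for p, l in zip(decoded_preds, decoded_labels)]))
    # Mean generated length in non-pad tokens.
    gen_len = float(np.mean([np.count_nonzero(np.asarray(p) != tokenizer.pad_token_id)
                             for p in preds]))
    return {"bleu": score, "accuracy": exact, "gen_len": gen_len}
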
INFO:__main__:*** Predict ***
[INFO|trainer.py:2964] 2023-02-12 22:00:36,388 >> ***** Running Prediction *****
[INFO|trainer.py:2966] 2023-02-12 22:00:36,388 >>   Num examples = 2000
[INFO|trainer.py:2969] 2023-02-12 22:00:36,388 >>   Batch size = 8
[INFO|configuration_utils.py:543] 2023-02-12 22:00:36,399 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 21% 53/250 [00:57<03:45,  1.15s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:34,581 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 22% 54/250 [00:58<03:45,  1.15s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:35,744 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 22% 55/250 [00:59<03:57,  1.22s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:37,121 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 22% 56/250 [01:00<03:53,  1.20s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:38,292 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 23% 57/250 [01:01<03:35,  1.12s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:39,209 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 23% 58/250 [01:02<03:27,  1.08s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:40,202 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 24% 59/250 [01:03<03:21,  1.06s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:41,205 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 24% 60/250 [01:04<03:26,  1.09s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:42,357 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 24% 61/250 [01:05<03:23,  1.08s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:43,423 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 25% 62/250 [01:07<03:50,  1.23s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:44,999 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 25% 63/250 [01:08<03:47,  1.22s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:46,190 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 26% 64/250 [01:10<04:05,  1.32s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:47,751 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 26% 65/250 [01:11<04:15,  1.38s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:49,265 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 26% 66/250 [01:12<03:48,  1.24s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:50,192 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 27% 67/250 [01:13<03:40,  1.20s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:51,305 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 27% 68/250 [01:14<03:28,  1.15s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:52,312 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 28% 69/250 [01:15<03:17,  1.09s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:53,280 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 28% 70/250 [01:16<03:13,  1.07s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:54,309 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 28% 71/250 [01:17<03:09,  1.06s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:55,338 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 29% 72/250 [01:18<02:56,  1.01it/s][INFO|configuration_utils.py:543] 2023-02-12 22:01:56,178 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 29% 73/250 [01:19<02:45,  1.07it/s][INFO|configuration_utils.py:543] 2023-02-12 22:01:56,970 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 30% 74/250 [01:20<03:03,  1.04s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:58,269 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 30% 75/250 [01:22<03:13,  1.10s/it][INFO|configuration_utils.py:543] 2023-02-12 22:01:59,522 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 30% 76/250 [01:23<03:39,  1.26s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:01,152 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 31% 77/250 [01:25<04:25,  1.53s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:03,314 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 31% 78/250 [01:27<04:23,  1.53s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:04,836 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 32% 79/250 [01:28<03:41,  1.30s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:05,592 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 32% 80/250 [01:29<03:20,  1.18s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:06,493 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 32% 81/250 [01:30<03:24,  1.21s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:07,774 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 33% 82/250 [01:31<03:36,  1.29s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:09,247 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 33% 83/250 [01:32<03:26,  1.23s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:10,355 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 34% 84/250 [01:33<03:10,  1.15s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:11,298 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 34% 85/250 [01:34<02:48,  1.02s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:12,035 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 34% 86/250 [01:35<02:40,  1.02it/s][INFO|configuration_utils.py:543] 2023-02-12 22:02:12,914 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 35% 87/250 [01:36<02:42,  1.00it/s][INFO|configuration_utils.py:543] 2023-02-12 22:02:13,946 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 35% 88/250 [01:37<03:05,  1.14s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:15,431 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 36% 89/250 [01:39<03:25,  1.28s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:17,020 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 36% 90/250 [01:41<03:46,  1.42s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:18,762 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 36% 91/250 [01:42<03:42,  1.40s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:20,117 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 37% 92/250 [01:43<03:14,  1.23s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:20,967 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 37% 93/250 [01:44<03:02,  1.16s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:21,962 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 38% 94/250 [01:45<02:51,  1.10s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:22,918 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 38% 95/250 [01:46<02:32,  1.02it/s][INFO|configuration_utils.py:543] 2023-02-12 22:02:23,631 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 38% 96/250 [01:47<02:28,  1.04it/s][INFO|configuration_utils.py:543] 2023-02-12 22:02:24,542 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 39% 97/250 [01:48<02:33,  1.00s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:25,632 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 39% 98/250 [01:49<02:47,  1.10s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:26,978 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 40% 99/250 [01:50<02:34,  1.02s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:27,807 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 40% 100/250 [01:51<02:34,  1.03s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:28,849 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 40% 101/250 [01:53<03:01,  1.22s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:30,513 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 41% 102/250 [01:54<03:33,  1.44s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:32,470 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 41% 103/250 [01:56<03:34,  1.46s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:33,981 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 42% 104/250 [01:58<03:44,  1.54s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:35,693 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 42% 105/250 [01:59<03:13,  1.34s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:36,555 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 42% 106/250 [02:00<02:58,  1.24s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:37,561 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 43% 107/250 [02:01<02:47,  1.17s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:38,589 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 43% 108/250 [02:02<02:42,  1.14s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:39,666 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 44% 109/250 [02:02<02:27,  1.05s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:40,480 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 44% 110/250 [02:03<02:24,  1.03s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:41,478 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 44% 111/250 [02:04<02:19,  1.00s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:42,411 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 45% 112/250 [02:06<02:23,  1.04s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:43,551 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 45% 113/250 [02:06<02:09,  1.06it/s][INFO|configuration_utils.py:543] 2023-02-12 22:02:44,274 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 46% 114/250 [02:07<02:15,  1.01it/s][INFO|configuration_utils.py:543] 2023-02-12 22:02:45,378 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 46% 115/250 [02:09<02:48,  1.25s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:47,220 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 46% 116/250 [02:11<03:02,  1.36s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:48,839 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 47% 117/250 [02:13<03:15,  1.47s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:50,560 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 47% 118/250 [02:14<03:03,  1.39s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:51,767 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 48% 119/250 [02:15<03:06,  1.42s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:53,259 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 48% 120/250 [02:17<02:59,  1.38s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:54,545 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 48% 121/250 [02:18<03:05,  1.44s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:56,138 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 49% 122/250 [02:19<02:54,  1.36s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:57,299 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 49% 123/250 [02:20<02:30,  1.19s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:58,087 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 50% 124/250 [02:21<02:35,  1.23s/it][INFO|configuration_utils.py:543] 2023-02-12 22:02:59,429 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 50% 125/250 [02:22<02:24,  1.15s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:00,397 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 50% 126/250 [02:23<02:19,  1.12s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:01,449 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 51% 127/250 [02:25<02:23,  1.16s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:02,709 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 51% 128/250 [02:27<02:45,  1.36s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:04,515 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 52% 129/250 [02:28<03:06,  1.54s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:06,474 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 52% 130/250 [02:29<02:41,  1.34s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:07,360 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 52% 131/250 [02:30<02:21,  1.19s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:08,182 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 53% 132/250 [02:31<02:23,  1.21s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:09,456 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 53% 133/250 [02:32<02:11,  1.12s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:10,375 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 54% 134/250 [02:33<02:05,  1.08s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:11,350 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 54% 135/250 [02:34<01:59,  1.04s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:12,287 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 54% 136/250 [02:35<01:59,  1.05s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:13,353 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 55% 137/250 [02:36<01:53,  1.00s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:14,259 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 55% 138/250 [02:37<01:40,  1.12it/s][INFO|configuration_utils.py:543] 2023-02-12 22:03:14,904 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 56% 139/250 [02:38<01:40,  1.10it/s][INFO|configuration_utils.py:543] 2023-02-12 22:03:15,835 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 56% 140/250 [02:39<01:34,  1.17it/s][INFO|configuration_utils.py:543] 2023-02-12 22:03:16,575 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 56% 141/250 [02:40<01:59,  1.10s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:18,238 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 57% 142/250 [02:42<02:24,  1.33s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:20,124 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 57% 143/250 [02:44<02:33,  1.43s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:21,786 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 58% 144/250 [02:45<02:26,  1.39s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:23,061 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 58% 145/250 [02:46<02:11,  1.25s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:24,000 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 58% 146/250 [02:47<02:07,  1.22s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:25,161 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 59% 147/250 [02:48<01:58,  1.15s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:26,152 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 59% 148/250 [02:49<01:58,  1.17s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:27,344 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 60% 149/250 [02:50<01:45,  1.05s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:28,109 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 60% 150/250 [02:51<01:41,  1.01s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:29,049 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 60% 151/250 [02:52<01:45,  1.07s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:30,250 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 61% 152/250 [02:53<01:45,  1.08s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:31,347 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 61% 153/250 [02:55<01:50,  1.13s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:32,616 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 62% 154/250 [02:56<02:08,  1.33s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:34,417 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 62% 155/250 [02:58<02:17,  1.44s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:36,114 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 62% 156/250 [02:59<02:11,  1.40s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:37,425 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 63% 157/250 [03:01<02:10,  1.41s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:38,839 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 63% 158/250 [03:02<01:57,  1.28s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:39,818 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 64% 159/250 [03:03<01:45,  1.16s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:40,694 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 64% 160/250 [03:04<01:34,  1.05s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:41,505 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 64% 161/250 [03:04<01:30,  1.02s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:42,433 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 65% 162/250 [03:06<01:33,  1.07s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:43,620 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 65% 163/250 [03:07<01:31,  1.05s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:44,634 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 66% 164/250 [03:08<01:33,  1.09s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:45,808 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 66% 165/250 [03:09<01:25,  1.00s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:46,607 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 66% 166/250 [03:10<01:28,  1.05s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:47,779 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 67% 167/250 [03:11<01:31,  1.11s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:49,017 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 67% 168/250 [03:13<01:41,  1.24s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:50,552 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 68% 169/250 [03:14<01:37,  1.20s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:51,669 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 68% 170/250 [03:16<01:55,  1.45s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:53,689 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 68% 171/250 [03:17<01:45,  1.33s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:54,756 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 69% 172/250 [03:18<01:33,  1.20s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:55,655 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 69% 173/250 [03:18<01:23,  1.08s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:56,452 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 70% 174/250 [03:20<01:25,  1.12s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:57,673 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 70% 175/250 [03:21<01:25,  1.14s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:58,849 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 70% 176/250 [03:22<01:18,  1.06s/it][INFO|configuration_utils.py:543] 2023-02-12 22:03:59,714 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 71% 177/250 [03:23<01:13,  1.00s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:00,583 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 71% 178/250 [03:23<01:09,  1.04it/s][INFO|configuration_utils.py:543] 2023-02-12 22:04:01,457 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 72% 179/250 [03:25<01:12,  1.02s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:02,606 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 72% 180/250 [03:26<01:14,  1.06s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:03,761 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 72% 181/250 [03:27<01:17,  1.13s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:05,046 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 73% 182/250 [03:28<01:20,  1.18s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:06,365 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 73% 183/250 [03:30<01:27,  1.31s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:07,954 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 74% 184/250 [03:31<01:23,  1.27s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:09,137 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 74% 185/250 [03:32<01:14,  1.15s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:10,012 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 74% 186/250 [03:33<01:13,  1.16s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:11,179 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 75% 187/250 [03:34<01:08,  1.08s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:12,095 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 75% 188/250 [03:35<01:03,  1.03s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:13,005 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 76% 189/250 [03:36<00:58,  1.04it/s][INFO|configuration_utils.py:543] 2023-02-12 22:04:13,794 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 76% 190/250 [03:37<01:00,  1.01s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:14,938 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 76% 191/250 [03:38<00:58,  1.02it/s][INFO|configuration_utils.py:543] 2023-02-12 22:04:15,855 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 77% 192/250 [03:39<01:02,  1.09s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:17,174 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 77% 193/250 [03:40<00:59,  1.04s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:18,109 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 78% 194/250 [03:41<00:59,  1.05s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:19,197 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 78% 195/250 [03:43<01:02,  1.14s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:20,553 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 78% 196/250 [03:44<01:11,  1.32s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:22,272 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 79% 197/250 [03:46<01:17,  1.45s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:24,046 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 79% 198/250 [03:47<01:10,  1.36s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:25,184 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 80% 199/250 [03:48<01:05,  1.29s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:26,303 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 80% 200/250 [03:49<00:57,  1.14s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:27,113 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 80% 201/250 [03:50<00:51,  1.05s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:27,928 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 81% 202/250 [03:51<00:54,  1.14s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:29,294 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 81% 203/250 [03:52<00:53,  1.13s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:30,413 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 82% 204/250 [03:54<00:51,  1.13s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:31,531 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 82% 205/250 [03:54<00:47,  1.06s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:32,437 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 82% 206/250 [03:55<00:43,  1.02it/s][INFO|configuration_utils.py:543] 2023-02-12 22:04:33,236 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 83% 207/250 [03:56<00:40,  1.05it/s][INFO|configuration_utils.py:543] 2023-02-12 22:04:34,110 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 83% 208/250 [03:58<00:48,  1.16s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:35,762 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 84% 209/250 [03:59<00:54,  1.32s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:37,462 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 84% 210/250 [04:01<00:58,  1.46s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:39,233 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 84% 211/250 [04:02<00:53,  1.37s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:40,401 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 85% 212/250 [04:04<00:50,  1.33s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:41,624 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 85% 213/250 [04:05<00:46,  1.24s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:42,675 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 86% 214/250 [04:06<00:40,  1.14s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:43,560 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 86% 215/250 [04:07<00:38,  1.09s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:44,528 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 86% 216/250 [04:07<00:33,  1.02it/s][INFO|configuration_utils.py:543] 2023-02-12 22:04:45,263 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 87% 217/250 [04:08<00:33,  1.02s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:46,374 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 87% 218/250 [04:10<00:34,  1.07s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:47,545 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 88% 219/250 [04:11<00:33,  1.09s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:48,685 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 88% 220/250 [04:12<00:32,  1.08s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:49,740 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 88% 221/250 [04:13<00:32,  1.11s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:50,941 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 89% 222/250 [04:14<00:32,  1.17s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:52,240 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 89% 223/250 [04:16<00:33,  1.26s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:53,702 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 90% 224/250 [04:17<00:29,  1.14s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:54,573 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 90% 225/250 [04:18<00:29,  1.19s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:55,878 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 90% 226/250 [04:19<00:27,  1.17s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:56,982 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 91% 227/250 [04:20<00:25,  1.10s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:57,934 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 91% 228/250 [04:21<00:22,  1.01s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:58,736 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 92% 229/250 [04:22<00:22,  1.06s/it][INFO|configuration_utils.py:543] 2023-02-12 22:04:59,907 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 92% 230/250 [04:23<00:19,  1.01it/s][INFO|configuration_utils.py:543] 2023-02-12 22:05:00,745 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 92% 231/250 [04:24<00:18,  1.03it/s][INFO|configuration_utils.py:543] 2023-02-12 22:05:01,677 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 93% 232/250 [04:25<00:17,  1.04it/s][INFO|configuration_utils.py:543] 2023-02-12 22:05:02,602 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 93% 233/250 [04:26<00:18,  1.09s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:03,991 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 94% 234/250 [04:27<00:16,  1.01s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:04,803 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 94% 235/250 [04:28<00:14,  1.03it/s][INFO|configuration_utils.py:543] 2023-02-12 22:05:05,691 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 94% 236/250 [04:29<00:16,  1.16s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:07,285 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 95% 237/250 [04:31<00:18,  1.39s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:09,217 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 95% 238/250 [04:32<00:15,  1.32s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:10,377 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 96% 239/250 [04:33<00:13,  1.22s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:11,351 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 96% 240/250 [04:35<00:12,  1.25s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:12,667 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 96% 241/250 [04:36<00:11,  1.25s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:13,925 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 97% 242/250 [04:37<00:09,  1.20s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:15,010 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 97% 243/250 [04:38<00:08,  1.22s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:16,263 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 98% 244/250 [04:40<00:07,  1.27s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:17,664 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 98% 245/250 [04:41<00:05,  1.17s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:18,591 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 98% 246/250 [04:42<00:04,  1.11s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:19,580 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 99% 247/250 [04:43<00:03,  1.11s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:20,666 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

 99% 248/250 [04:44<00:02,  1.27s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:22,315 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

100% 249/250 [04:46<00:01,  1.29s/it][INFO|configuration_utils.py:543] 2023-02-12 22:05:23,652 >> Generate config GenerationConfig {
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.26.1"
}

100% 250/250 [04:47<00:00,  1.15s/it]
***** predict metrics *****
  predict_accuracy           =        1.0
  predict_bleu               =        0.0
  predict_gen_len            =        2.0
  predict_loss               =     0.3818
  predict_runtime            = 0:04:49.01
  predict_samples            =       2000
  predict_samples_per_second =       6.92
  predict_steps_per_second   =      0.865
[INFO|modelcard.py:449] 2023-02-12 22:05:25,687 >> Dropping the following result as it does not have all the necessary fields:
{'task': {'name': 'Translation', 'type': 'translation'}, 'metrics': [{'name': 'Bleu', 'type': 'bleu', 'value': 0.0}, {'name': 'Accuracy', 'type': 'accuracy', 'value': 1.0}]}
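
The predict run ends with accuracy 1.0 and BLEU 0.0 at a generated length of about 2 tokens, i.e. the model emits very short targets, so exact-match accuracy is the informative score here while BLEU on two-token outputs is effectively zero. Before copying the checkpoint to Drive, it can be sanity-checked directly from the output directory; a minimal sketch, assuming the run wrote its final checkpoint to /content/out (the directory copied in the next cell) and using a placeholder prompt:

# Minimal inference sketch; /content/out is the training output_dir copied below,
# and the input string is only an illustration, not a sample from the dataset.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("/content/out")
model = AutoModelForSeq2SeqLM.from_pretrained("/content/out")

inputs = tokenizer("example input text", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))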

Save model

from google.colab import drive

drive.mount('/content/drive')
!cp -r /content/out /content/drive/MyDrive/models
Mounted at /content/drive
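
In a later session the fine-tuned checkpoint can be restored from the Drive copy instead of retraining; a minimal sketch, assuming the cp above placed the files under /content/drive/MyDrive/models/out (the exact destination depends on whether the models directory already existed):

# Reload the checkpoint from Google Drive in a fresh Colab session.
from google.colab import drive
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

drive.mount('/content/drive')

ckpt = "/content/drive/MyDrive/models/out"  # assumed destination of the cp above
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModelForSeq2SeqLM.from_pretrained(ckpt)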