# VideoMAE_base_wlasl_100__signer_200ep_coR
This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 3.6367
- Accuracy: 0.4793
## Model description
More information needed
## Intended uses & limitations
More information needed
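
The checkpoint can be loaded for video classification with the standard Transformers API. The snippet below is a minimal inference sketch, not part of the original card: the dummy frames and the 16-frame clip length are placeholders and should be replaced with real decoded video frames.

```python
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

model_id = "Shawon16/VideoMAE_base_wlasl_100__signer_200ep_coR"
processor = VideoMAEImageProcessor.from_pretrained(model_id)
model = VideoMAEForVideoClassification.from_pretrained(model_id)

# Placeholder clip: replace with 16 real RGB frames sampled from a video.
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```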
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 36000
- mixed_precision_training: Native AMP
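
For reference, the settings above correspond roughly to the following `TrainingArguments`. This is a sketch rather than the original training script; `output_dir`, the mapping of "training_steps" to `max_steps`, and the use of `fp16=True` for Native AMP are assumptions.

```python
from transformers import TrainingArguments

# Sketch of the configuration implied by the hyperparameter list above.
training_args = TrainingArguments(
    output_dir="VideoMAE_base_wlasl_100__signer_200ep_coR",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=4,   # effective train batch size of 8
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=36000,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```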
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 18.6663 | 0.005 | 180 | 4.6585 | 0.0118 |
| 18.5985 | 1.0050 | 360 | 4.6345 | 0.0207 |
| 18.5062 | 2.0050 | 540 | 4.6301 | 0.0148 |
| 18.3482 | 3.0050 | 721 | 4.6212 | 0.0178 |
| 18.3963 | 4.005 | 901 | 4.6182 | 0.0207 |
| 18.2979 | 5.0050 | 1081 | 4.6092 | 0.0237 |
| 18.2053 | 6.0050 | 1261 | 4.6505 | 0.0178 |
| 18.0284 | 7.0050 | 1442 | 4.5878 | 0.0266 |
| 17.9774 | 8.005 | 1622 | 4.5875 | 0.0325 |
| 17.672 | 9.0050 | 1802 | 4.4892 | 0.0355 |
| 17.0746 | 10.0050 | 1982 | 4.3039 | 0.0503 |
| 16.2939 | 11.0050 | 2163 | 4.1299 | 0.0562 |
| 15.2755 | 12.005 | 2343 | 3.8815 | 0.1006 |
| 13.843 | 13.0050 | 2523 | 3.7165 | 0.1213 |
| 12.4153 | 14.0050 | 2703 | 3.4509 | 0.1657 |
| 10.9809 | 15.0050 | 2884 | 3.1755 | 0.2249 |
| 9.5775 | 16.005 | 3064 | 2.9489 | 0.2544 |
| 8.4151 | 17.0050 | 3244 | 2.8276 | 0.3195 |
| 7.0531 | 18.0050 | 3424 | 2.6719 | 0.3225 |
| 5.9795 | 19.0050 | 3605 | 2.6143 | 0.3491 |
| 4.8469 | 20.005 | 3785 | 2.6037 | 0.3521 |
| 3.9384 | 21.0050 | 3965 | 2.4071 | 0.3935 |
| 3.176 | 22.0050 | 4145 | 2.3498 | 0.4260 |
| 2.5533 | 23.0050 | 4326 | 2.2546 | 0.4290 |
| 2.0217 | 24.005 | 4506 | 2.2616 | 0.4408 |
| 1.5032 | 25.0050 | 4686 | 2.2962 | 0.4290 |
| 1.3052 | 26.0050 | 4866 | 2.3613 | 0.4024 |
| 1.0223 | 27.0050 | 5047 | 2.3479 | 0.4290 |
| 0.8638 | 28.005 | 5227 | 2.2887 | 0.4556 |
| 0.7623 | 29.0050 | 5407 | 2.3140 | 0.4497 |
| 0.688 | 30.0050 | 5587 | 2.5885 | 0.4201 |
| 0.4905 | 31.0050 | 5768 | 2.4310 | 0.4527 |
| 0.4466 | 32.005 | 5948 | 2.4134 | 0.4556 |
| 0.4291 | 33.0050 | 6128 | 2.3113 | 0.4734 |
| 0.3286 | 34.0050 | 6308 | 2.6219 | 0.4704 |
| 0.3314 | 35.0050 | 6489 | 2.6444 | 0.4556 |
| 0.2724 | 36.005 | 6669 | 2.6187 | 0.4704 |
| 0.3126 | 37.0050 | 6849 | 2.8826 | 0.4438 |
| 0.2677 | 38.0050 | 7029 | 2.7060 | 0.5059 |
| 0.1616 | 39.0050 | 7210 | 2.7237 | 0.5030 |
| 0.2657 | 40.005 | 7390 | 2.8089 | 0.4822 |
| 0.2641 | 41.0050 | 7570 | 2.8814 | 0.4941 |
| 0.0988 | 42.0050 | 7750 | 3.0111 | 0.5059 |
| 0.1412 | 43.0050 | 7931 | 3.0738 | 0.4763 |
| 0.2132 | 44.005 | 8111 | 3.1186 | 0.4704 |
| 0.2322 | 45.0050 | 8291 | 3.1673 | 0.4734 |
| 0.2135 | 46.0050 | 8471 | 3.0642 | 0.4793 |
| 0.2689 | 47.0050 | 8652 | 3.2337 | 0.4615 |
| 0.2157 | 48.005 | 8832 | 3.2357 | 0.4793 |
| 0.229 | 49.0050 | 9012 | 2.9704 | 0.5237 |
| 0.2051 | 50.0050 | 9192 | 3.1482 | 0.4763 |
| 0.2083 | 51.0050 | 9373 | 3.4135 | 0.4615 |
| 0.2298 | 52.005 | 9553 | 3.3165 | 0.5000 |
| 0.1338 | 53.0050 | 9733 | 3.1129 | 0.4852 |
| 0.1476 | 54.0050 | 9913 | 3.2821 | 0.4734 |
| 0.2206 | 55.0050 | 10094 | 3.1688 | 0.5178 |
| 0.1698 | 56.005 | 10274 | 3.3487 | 0.4911 |
| 0.1568 | 57.0050 | 10454 | 3.0934 | 0.5059 |
| 0.1059 | 58.0050 | 10634 | 3.6909 | 0.4763 |
| 0.1377 | 59.0050 | 10815 | 3.4659 | 0.4882 |
| 0.194 | 60.005 | 10995 | 3.8766 | 0.4231 |
| 0.323 | 61.0050 | 11175 | 3.7657 | 0.4438 |
| 0.2116 | 62.0050 | 11355 | 3.5875 | 0.4911 |
| 0.3221 | 63.0050 | 11536 | 3.6262 | 0.4675 |
| 0.2123 | 64.005 | 11716 | 3.2548 | 0.4852 |
| 0.1485 | 65.0050 | 11896 | 3.2315 | 0.5266 |
| 0.1372 | 66.0050 | 12076 | 3.7162 | 0.4793 |
| 0.1303 | 67.0050 | 12257 | 3.5759 | 0.4911 |
| 0.1532 | 68.005 | 12437 | 3.4257 | 0.4882 |
| 0.1345 | 69.0050 | 12617 | 3.6885 | 0.4645 |
| 0.2182 | 70.0050 | 12797 | 3.8118 | 0.4734 |
| 0.2118 | 71.0050 | 12978 | 3.8203 | 0.5030 |
| 0.1931 | 72.005 | 13158 | 3.8547 | 0.4970 |
| 0.1849 | 73.0050 | 13338 | 3.5080 | 0.5237 |
| 0.2151 | 74.0050 | 13518 | 4.1261 | 0.4527 |
| 0.2144 | 75.0050 | 13699 | 3.4703 | 0.5118 |
| 0.2093 | 76.005 | 13879 | 3.5965 | 0.4911 |
| 0.1171 | 77.0050 | 14059 | 3.6071 | 0.5118 |
| 0.1377 | 78.0050 | 14239 | 3.6424 | 0.4970 |
| 0.1658 | 79.0050 | 14420 | 3.7843 | 0.4793 |
| 0.2543 | 80.005 | 14600 | 3.7443 | 0.4970 |
| 0.1849 | 81.0050 | 14780 | 3.9182 | 0.4941 |
| 0.1168 | 82.0050 | 14960 | 3.9852 | 0.4497 |
| 0.1406 | 83.0050 | 15141 | 3.7811 | 0.4970 |
| 0.1608 | 84.005 | 15321 | 3.8168 | 0.4793 |
| 0.1348 | 85.0050 | 15501 | 3.6367 | 0.4793 |
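
The accuracy column is top-1 accuracy on the evaluation set. A typical `compute_metrics` hook that produces such values is sketched below; it is an assumed, illustrative implementation, not taken from the original training code.

```python
import numpy as np
import evaluate

# Top-1 accuracy metric, as reported in the table above (illustrative).
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```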
### Framework versions
- Transformers 4.46.1
- PyTorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.20.1