rtdetr-paper-control-300ep-20260305-v1

This model is a fine-tuned version of PekingU/rtdetr_r101vd_coco_o365 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 7.3432
  • mAP: 0.4797
  • mAP@50: 0.7782
  • mAP@75: 0.5303
  • mAP small: 0.0505
  • mAP medium: 0.4796
  • mAP large: 0.56
  • mAR@1: 0.0571
  • mAR@10: 0.3444
  • mAR@100: 0.6572
  • mAR small: 0.05
  • mAR medium: 0.6488
  • mAR large: 0.8162
  • mAP tray: 0.4304
  • mAR@100 tray: 0.5838
  • mAP cart: 0.529
  • mAR@100 cart: 0.7306
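
The mAP@50 and mAP@75 figures are COCO-style average precision at IoU thresholds of 0.50 and 0.75, and the small/medium/large splits bucket objects by pixel area (the COCO convention is area below 32², between 32² and 96², and above 96² pixels). As a reminder of what the IoU threshold itself means, here is a minimal plain-Python sketch (boxes as `[x1, y1, x2, y2]`; this is illustrative, not the evaluator's code):

```python
def box_iou(a, b):
    """Intersection-over-union of two axis-aligned boxes [x1, y1, x2, y2]."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A detection counts as a true positive for mAP@50 when IoU >= 0.50,
# but for mAP@75 it must clear the stricter 0.75 threshold.
iou = box_iou([0, 0, 10, 10], [2, 0, 12, 10])  # intersection 80, union 120
print(iou)  # ≈ 0.667: a hit at the 0.50 threshold, a miss at 0.75
```

The gap between mAP@50 (0.7782) and mAP@75 (0.5303) above therefore says the model finds the objects reliably but localizes them less tightly.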

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: AdamW (adamw_torch_fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: constant
  • num_epochs: 300
  • mixed_precision_training: Native AMP
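
The total_train_batch_size of 16 is the per-device batch size of 4 multiplied by 4 gradient-accumulation steps. A minimal plain-Python sketch of why accumulating scaled micro-batch gradients reproduces the full-batch gradient (a 1-D least-squares model for illustration, not the trainer's actual code):

```python
def grad(w, xs, ys):
    """Mean gradient of (w*x - y)^2 over a batch: 2*x*(w*x - y)."""
    return sum(2 * x * (w * x - y) for x, y in zip(xs, ys)) / len(xs)

w = 0.5
xs = [float(i % 5 + 1) for i in range(16)]  # an effective batch of 16 samples
ys = [2 * x for x in xs]                    # target slope 2

micro, accum_steps = 4, 4  # mirrors train_batch_size=4, gradient_accumulation_steps=4
full = grad(w, xs, ys)     # gradient of one step on the full effective batch

acc = 0.0
for i in range(accum_steps):
    chunk_x = xs[i * micro:(i + 1) * micro]
    chunk_y = ys[i * micro:(i + 1) * micro]
    acc += grad(w, chunk_x, chunk_y) / accum_steps  # scale each micro-batch

print(abs(full - acc) < 1e-9)  # True: same update as the full batch of 16
```

This is why accumulation lets a 4-per-step configuration train as if the batch size were 16, at the cost of 4 forward/backward passes per optimizer step.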

Training results

Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP small | mAP medium | mAP large | mAR@1 | mAR@10 | mAR@100 | mAR small | mAR medium | mAR large | mAP tray | mAR@100 tray | mAP cart | mAR@100 cart
("No log" in the first column means no training loss was logged for that interval; the rows below are space-separated in the same column order.)
No log 1.0 7 52.0209 0.0002 0.0005 0.0 0.0 0.0004 0.0007 0.0 0.0 0.0256 0.0 0.0225 0.0843 0.0 0.0029 0.0003 0.0484
No log 2.0 14 31.5488 0.0103 0.0265 0.0072 0.0 0.012 0.0227 0.001 0.0176 0.0693 0.0 0.0629 0.1912 0.02 0.0886 0.0005 0.05
No log 3.0 21 28.2528 0.0127 0.0338 0.0072 0.0 0.0097 0.0421 0.0012 0.0177 0.1023 0.0 0.0906 0.3206 0.0242 0.0998 0.0012 0.1048
No log 4.0 28 37.8709 0.0162 0.0515 0.0056 0.0 0.0159 0.0494 0.0007 0.0313 0.0762 0.0 0.0696 0.1971 0.0309 0.104 0.0015 0.0484
No log 5.0 35 37.3820 0.0346 0.1046 0.0173 0.0 0.0353 0.0477 0.0021 0.0411 0.1068 0.0 0.1022 0.1946 0.0655 0.1475 0.0038 0.0661
No log 6.0 42 36.1954 0.0651 0.1492 0.0425 0.0168 0.0656 0.0948 0.0134 0.0581 0.1476 0.05 0.1439 0.223 0.1147 0.2242 0.0155 0.071
No log 7.0 49 30.6148 0.0861 0.1573 0.0883 0.0404 0.0914 0.1376 0.0103 0.078 0.1941 0.2 0.1876 0.3127 0.1555 0.2881 0.0168 0.1
No log 8.0 56 23.8829 0.1194 0.2319 0.1211 0.101 0.1289 0.1371 0.0193 0.1082 0.2528 0.1 0.2396 0.501 0.1878 0.3475 0.051 0.1581
No log 9.0 63 19.4596 0.1697 0.3124 0.1718 0.2427 0.1784 0.1977 0.0266 0.1436 0.3223 0.35 0.3084 0.5701 0.2327 0.3961 0.1067 0.2484
No log 10.0 70 15.2607 0.1917 0.3932 0.1882 0.2427 0.1888 0.3115 0.0362 0.1644 0.378 0.35 0.3633 0.6422 0.2132 0.4061 0.1702 0.35
No log 11.0 77 14.7306 0.2121 0.4 0.225 0.3515 0.2133 0.3162 0.0326 0.1666 0.3806 0.35 0.3747 0.4784 0.2501 0.4353 0.174 0.3258
No log 12.0 84 13.5177 0.1856 0.3542 0.1942 0.0 0.1853 0.2909 0.0224 0.1383 0.3419 0.0 0.3274 0.6181 0.2646 0.4531 0.1065 0.2306
No log 13.0 91 12.6535 0.2237 0.4062 0.2263 0.0 0.2307 0.3097 0.0438 0.1653 0.383 0.0 0.3669 0.6892 0.2829 0.4515 0.1645 0.3145
No log 14.0 98 12.0757 0.2536 0.438 0.2623 0.0 0.2531 0.3565 0.0506 0.1969 0.4104 0.0 0.3942 0.7221 0.3114 0.4966 0.1959 0.3242
No log 15.0 105 11.2874 0.233 0.4118 0.2445 0.0 0.2371 0.3383 0.0364 0.1578 0.3956 0.0 0.3819 0.6627 0.3666 0.5525 0.0994 0.2387
No log 16.0 112 10.8548 0.2832 0.4823 0.2996 0.0 0.2847 0.3631 0.0447 0.2115 0.4504 0.0 0.4371 0.7157 0.3939 0.5557 0.1726 0.3452
No log 17.0 119 10.5296 0.3039 0.5178 0.325 0.0 0.31 0.3528 0.0473 0.2097 0.448 0.0 0.4408 0.5887 0.3865 0.5541 0.2213 0.3419
No log 18.0 126 9.9201 0.3223 0.5444 0.3532 0.101 0.3304 0.3149 0.0541 0.227 0.4706 0.1 0.4641 0.5951 0.4061 0.5669 0.2386 0.3742
No log 19.0 133 10.3674 0.3342 0.5783 0.349 0.101 0.3423 0.3409 0.0556 0.237 0.4733 0.1 0.464 0.651 0.4026 0.5433 0.2658 0.4032
No log 20.0 140 9.8158 0.3266 0.5346 0.3411 0.101 0.3375 0.3533 0.0505 0.2243 0.4896 0.1 0.4798 0.6779 0.4056 0.5711 0.2475 0.4081
No log 21.0 147 9.5132 0.3562 0.5817 0.3936 0.101 0.3662 0.379 0.0521 0.2586 0.4988 0.1 0.4878 0.7127 0.4309 0.5831 0.2814 0.4145
No log 22.0 154 9.4140 0.368 0.6068 0.4151 0.101 0.3649 0.4702 0.0526 0.2695 0.5164 0.1 0.5008 0.8172 0.418 0.573 0.3179 0.4597
No log 23.0 161 9.2565 0.3727 0.623 0.3976 0.101 0.3734 0.4767 0.054 0.2727 0.5187 0.1 0.5045 0.7946 0.4244 0.5665 0.321 0.471
No log 24.0 168 8.9574 0.3983 0.6546 0.4503 0.101 0.4023 0.4551 0.0524 0.2917 0.5378 0.1 0.5212 0.8613 0.4353 0.5676 0.3613 0.5081
No log 25.0 175 8.5775 0.4209 0.6776 0.4786 0.101 0.4178 0.5274 0.06 0.303 0.5542 0.1 0.5403 0.8245 0.4316 0.5923 0.4101 0.5161
No log 26.0 182 8.3658 0.4661 0.7476 0.5103 0.101 0.465 0.5367 0.0638 0.3461 0.6212 0.1 0.6119 0.8005 0.4063 0.5843 0.526 0.6581
No log 27.0 189 8.1695 0.4829 0.7707 0.5387 0.101 0.4884 0.509 0.0584 0.3327 0.637 0.1 0.6282 0.8049 0.413 0.5852 0.5528 0.6887
No log 28.0 196 8.4744 0.468 0.7682 0.5038 0.101 0.4691 0.5257 0.0603 0.3253 0.6179 0.1 0.6072 0.8211 0.3953 0.5665 0.5407 0.6694
No log 29.0 203 7.9397 0.5061 0.8003 0.5454 0.101 0.5051 0.5746 0.0592 0.3478 0.655 0.1 0.6465 0.8181 0.4259 0.5907 0.5863 0.7194
No log 30.0 210 7.9048 0.5079 0.8185 0.5428 0.101 0.5063 0.5854 0.0619 0.3495 0.6593 0.1 0.651 0.8167 0.4138 0.5799 0.602 0.7387
No log 31.0 217 8.0720 0.4907 0.7843 0.5395 0.101 0.4981 0.4826 0.055 0.3456 0.6649 0.1 0.658 0.7956 0.4155 0.5878 0.5659 0.7419
No log 32.0 224 8.0961 0.5034 0.8035 0.5719 0.101 0.5012 0.6458 0.0621 0.3407 0.655 0.1 0.6455 0.8328 0.393 0.5713 0.6139 0.7387
No log 33.0 231 7.9472 0.5262 0.8299 0.6003 0.101 0.5239 0.671 0.0684 0.342 0.6618 0.1 0.6543 0.8044 0.4309 0.5961 0.6214 0.7274
No log 34.0 238 7.8616 0.5284 0.8342 0.5978 0.101 0.5267 0.6558 0.0659 0.3517 0.6733 0.1 0.6656 0.8196 0.4233 0.5886 0.6336 0.7581
No log 35.0 245 7.9081 0.5432 0.8471 0.6349 0.101 0.5417 0.671 0.0649 0.3569 0.6758 0.1 0.6688 0.8088 0.4337 0.5936 0.6526 0.7581
No log 36.0 252 8.1714 0.5404 0.8545 0.6202 0.101 0.5407 0.6592 0.0642 0.3552 0.6631 0.1 0.6549 0.8162 0.4375 0.5811 0.6434 0.7452
No log 37.0 259 7.8066 0.5435 0.841 0.6259 0.101 0.5436 0.5881 0.0617 0.3531 0.6645 0.1 0.6554 0.8392 0.4407 0.5904 0.6463 0.7387
No log 38.0 266 7.8896 0.5251 0.8331 0.6025 0.101 0.5266 0.5869 0.056 0.3511 0.6608 0.1 0.6522 0.8225 0.428 0.586 0.6223 0.7355
No log 39.0 273 7.6938 0.5327 0.8474 0.6161 0.101 0.5324 0.6111 0.0575 0.3509 0.6528 0.1 0.6419 0.8588 0.4409 0.5814 0.6246 0.7242
No log 40.0 280 7.7116 0.5442 0.8495 0.6457 0.101 0.5459 0.6057 0.0579 0.3562 0.6613 0.1 0.6528 0.824 0.4645 0.6 0.6238 0.7226
No log 41.0 287 7.8024 0.5437 0.8543 0.6388 0.101 0.5458 0.5457 0.062 0.3529 0.6596 0.1 0.6499 0.8422 0.4521 0.5852 0.6354 0.7339
No log 42.0 294 7.6427 0.5433 0.8435 0.6397 0.101 0.5472 0.5426 0.0521 0.3493 0.6672 0.1 0.6573 0.8559 0.4678 0.6053 0.6189 0.729
No log 43.0 301 7.5489 0.5314 0.8411 0.6176 0.101 0.5333 0.5614 0.0536 0.3518 0.6603 0.1 0.6506 0.8436 0.4485 0.5915 0.6144 0.729
No log 44.0 308 7.5076 0.5344 0.8317 0.6286 0.101 0.5371 0.5468 0.0601 0.3573 0.668 0.1 0.6589 0.8422 0.4452 0.5989 0.6237 0.7371
No log 45.0 315 7.7148 0.5219 0.8304 0.6008 0.101 0.5246 0.5515 0.0571 0.3494 0.6583 0.1 0.6486 0.8436 0.4332 0.586 0.6106 0.7306
No log 46.0 322 7.6162 0.5108 0.8174 0.5738 0.101 0.5105 0.5635 0.0594 0.3511 0.6538 0.1 0.644 0.8387 0.4182 0.5851 0.6034 0.7226
No log 47.0 329 7.5197 0.5304 0.8283 0.6041 0.101 0.529 0.6265 0.0599 0.3575 0.6725 0.1 0.6625 0.8632 0.4452 0.5966 0.6156 0.7484
No log 48.0 336 7.5130 0.5234 0.8292 0.5979 0.3515 0.5208 0.6144 0.0604 0.3587 0.6655 0.35 0.6548 0.8603 0.4321 0.5907 0.6148 0.7403
No log 49.0 343 7.9138 0.4962 0.7833 0.5739 0.101 0.5032 0.4042 0.0593 0.3344 0.6418 0.1 0.6306 0.851 0.4044 0.5626 0.5879 0.721
No log 50.0 350 7.5024 0.5342 0.8298 0.6249 0.101 0.5336 0.5962 0.0606 0.3548 0.6695 0.1 0.6599 0.851 0.4386 0.5955 0.6297 0.7435
No log 51.0 357 7.5145 0.4965 0.7934 0.5972 0.0505 0.4976 0.5537 0.0621 0.3492 0.6605 0.05 0.6502 0.8554 0.4376 0.5807 0.5553 0.7403
No log 52.0 364 7.4957 0.5143 0.8056 0.6085 0.101 0.5138 0.5452 0.0579 0.3551 0.6725 0.1 0.6637 0.8387 0.4315 0.5982 0.5971 0.7468
No log 53.0 371 7.5725 0.5109 0.8051 0.5893 0.101 0.511 0.5503 0.0596 0.3524 0.662 0.1 0.6519 0.8539 0.4193 0.5966 0.6025 0.7274
No log 54.0 378 7.5533 0.5233 0.8235 0.6034 0.101 0.5218 0.6023 0.0652 0.3531 0.6598 0.1 0.6507 0.8314 0.425 0.5921 0.6215 0.7274
No log 55.0 385 7.3714 0.5347 0.8371 0.613 0.101 0.5345 0.5914 0.061 0.3596 0.6711 0.1 0.6633 0.8176 0.4538 0.6019 0.6156 0.7403
No log 56.0 392 7.4263 0.5078 0.8067 0.584 0.0505 0.5084 0.5391 0.0592 0.3572 0.6642 0.05 0.6544 0.851 0.4286 0.5994 0.587 0.729
No log 57.0 399 7.2185 0.5168 0.8143 0.6013 0.0505 0.5177 0.5529 0.0629 0.356 0.6768 0.05 0.6687 0.8314 0.4255 0.6003 0.608 0.7532
No log 58.0 406 7.4327 0.5198 0.8265 0.6135 0.0505 0.5219 0.5466 0.0619 0.3564 0.6693 0.05 0.6606 0.8343 0.4229 0.5902 0.6167 0.7484
No log 59.0 413 7.2584 0.5303 0.8424 0.6396 0.101 0.5293 0.6028 0.0616 0.3479 0.6725 0.1 0.663 0.8525 0.4445 0.603 0.6161 0.7419
No log 60.0 420 7.3321 0.5164 0.8176 0.609 0.0505 0.5159 0.5792 0.0643 0.3565 0.6633 0.05 0.6534 0.8525 0.4451 0.5944 0.5878 0.7323
No log 61.0 427 7.4019 0.5209 0.8239 0.6145 0.0505 0.5205 0.6069 0.0588 0.3542 0.6601 0.05 0.6529 0.7966 0.4429 0.5928 0.599 0.7274
No log 62.0 434 7.2498 0.5257 0.8223 0.6158 0.0505 0.5253 0.5705 0.0617 0.3581 0.6717 0.05 0.664 0.8176 0.4421 0.595 0.6094 0.7484
No log 63.0 441 7.3263 0.5094 0.806 0.6249 0.0505 0.5089 0.5569 0.0638 0.3569 0.6582 0.05 0.6485 0.8436 0.4176 0.5905 0.6013 0.7258
No log 64.0 448 7.3728 0.5101 0.8177 0.5958 0.0505 0.5099 0.5888 0.0643 0.3524 0.6603 0.05 0.651 0.8358 0.4154 0.5819 0.6049 0.7387
No log 65.0 455 7.4039 0.5064 0.8135 0.5825 0.2515 0.5069 0.5459 0.0575 0.3511 0.6587 0.25 0.6503 0.8132 0.4237 0.5917 0.5891 0.7258
No log 66.0 462 7.3934 0.5249 0.8203 0.6113 0.3515 0.5252 0.5033 0.0643 0.3561 0.6694 0.35 0.6601 0.8392 0.4341 0.6034 0.6156 0.7355
No log 67.0 469 7.5553 0.4979 0.8092 0.5538 0.2515 0.4984 0.543 0.0639 0.3498 0.6533 0.25 0.6448 0.8074 0.4112 0.5743 0.5846 0.7323
No log 68.0 476 7.4750 0.523 0.8388 0.6028 0.101 0.5258 0.4703 0.0529 0.3469 0.6538 0.1 0.6444 0.8314 0.4536 0.5883 0.5923 0.7194
No log 69.0 483 7.3127 0.5208 0.8292 0.5892 0.101 0.5198 0.6035 0.06 0.3498 0.6571 0.1 0.6494 0.8029 0.4318 0.59 0.6098 0.7242
No log 70.0 490 7.3432 0.4797 0.7782 0.5303 0.0505 0.4796 0.56 0.0571 0.3444 0.6572 0.05 0.6488 0.8162 0.4304 0.5838 0.529 0.7306
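
Note that the final-epoch metrics reported at the top are not the best seen during training: validation mAP peaks at 0.5442 at epoch 40, versus 0.4797 at epoch 70, and the lowest validation loss (7.2185) occurs at epoch 57. With a constant learning-rate schedule it is worth selecting the checkpoint by validation mAP rather than keeping the last one. A sketch over a few of the logged rows (values copied from the table above):

```python
# (epoch, validation loss, mAP) for a few logged evaluation points
rows = [
    (35, 7.9081, 0.5432),
    (40, 7.7116, 0.5442),
    (57, 7.2185, 0.5168),
    (70, 7.3432, 0.4797),  # final epoch, reported at the top of the card
]

best = max(rows, key=lambda r: r[2])  # checkpoint with the highest mAP
print(best)  # (40, 7.7116, 0.5442)
```

As the rows show, the best-loss and best-mAP checkpoints disagree here (epoch 57 vs. epoch 40), so the selection metric should match the downstream objective.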

Framework versions

  • Transformers 5.3.0
  • PyTorch 2.10.0+cu128
  • Datasets 4.6.1
  • Tokenizers 0.22.2
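
To reproduce this environment, a requirements pin along these lines should work (versions taken from the list above; the exact CUDA build of PyTorch may differ on your machine):

```
transformers==5.3.0
torch==2.10.0
datasets==4.6.1
tokenizers==0.22.2
```
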

Model size: 76.6M params (Safetensors, F32)

Model tree for nielsr/rtdetr-paper-control-300ep-20260305-v1

Finetuned from PekingU/rtdetr_r101vd_coco_o365.