MONAI 101 tutorial - MONAI/Ignite
Setup environment
!python -c "import monai" || pip install -q "monai-weekly[ignite, tqdm]"
Setup imports
import logging
import numpy as np
import os
from pathlib import Path
import sys
import tempfile
import torch
from monai.apps import MedNISTDataset
from monai.config import print_config
from monai.data import DataLoader
from monai.engines import SupervisedTrainer
from monai.handlers import StatsHandler
from monai.inferers import SimpleInferer
from monai.networks import eval_mode
from monai.networks.nets import densenet121
from monai.transforms import LoadImageD, EnsureChannelFirstD, ScaleIntensityD, Compose
print_config()
MONAI version: 1.5.2
Numpy version: 2.4.2
Pytorch version: 2.6.0
MONAI flags: HAS_EXT = False, USE_COMPILED = False, USE_META_DICT = False
MONAI rev id: d18565fb3e4fd8c556707f91ac280a2dc3f681c1
MONAI __file__: /home/<username>/miniforge3/envs/biomonai_ignite/lib/python3.11/site-packages/monai/__init__.py
Optional dependencies:
Pytorch Ignite version: 0.5.3
ITK version: NOT INSTALLED or UNKNOWN VERSION.
Nibabel version: 5.3.3
scikit-image version: 0.26.0
scipy version: 1.17.0
Pillow version: 12.1.1
Tensorboard version: NOT INSTALLED or UNKNOWN VERSION.
gdown version: NOT INSTALLED or UNKNOWN VERSION.
TorchVision version: NOT INSTALLED or UNKNOWN VERSION.
tqdm version: 4.67.3
lmdb version: NOT INSTALLED or UNKNOWN VERSION.
psutil version: 7.2.2
pandas version: 3.0.0
einops version: 0.8.2
transformers version: NOT INSTALLED or UNKNOWN VERSION.
mlflow version: NOT INSTALLED or UNKNOWN VERSION.
pynrrd version: NOT INSTALLED or UNKNOWN VERSION.
clearml version: NOT INSTALLED or UNKNOWN VERSION.
For details about installing the optional dependencies, please visit:
https://docs.monai.io/en/latest/installation.html#installing-the-recommended-dependencies
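The listing above shows several optional dependencies as NOT INSTALLED. If you want to check for an optional module yourself before relying on it, a small stdlib sketch (the module names below are only examples, not MONAI requirements):

```python
import importlib.util

def has_module(name):
    """Return True if a module can be imported, without actually importing it."""
    return importlib.util.find_spec(name) is not None

# "os" is always present; the second name stands in for a missing optional dependency
for name in ["os", "some_missing_optional_dep"]:
    print(name, has_module(name))
```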
Setup data directory
You can specify a directory with the MONAI_DATA_DIRECTORY environment variable.
This allows you to save results and reuse downloads.
If not specified, a temporary directory will be used.
directory = os.environ.get("MONAI_DATA_DIRECTORY")
if directory is not None:
os.makedirs(directory, exist_ok=True)
root_dir = tempfile.mkdtemp() if directory is None else directory
print(root_dir)
/tmp/tmp62hi5aur
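The directory-selection logic above can be packaged as a small helper if you reuse it across notebooks; a sketch with stdlib only (`get_data_dir` is a hypothetical name, not a MONAI function):

```python
import os
import tempfile

def get_data_dir(env_var="MONAI_DATA_DIRECTORY"):
    """Return a persistent data directory from the environment if set,
    otherwise fall back to a fresh temporary directory."""
    directory = os.environ.get(env_var)
    if directory is not None:
        os.makedirs(directory, exist_ok=True)  # reuse downloads across runs
        return directory
    return tempfile.mkdtemp()  # discarded with the temp dir

root_dir = get_data_dir()
print(root_dir)
```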
Use MONAI transforms to preprocess data
Medical images require specialized methods for I/O, preprocessing, and augmentation: they often come in domain-specific formats, are handled with specific protocols, and the data arrays are often high-dimensional.
In this example, we load the images, ensure they are channel-first, and scale their intensities with the three monai.transforms listed below, composed into a pipeline ready for the following steps.
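As an aside, ScaleIntensityD with default arguments performs min-max scaling into [0, 1]. A minimal pure-Python sketch of that operation on a flat list of values (`scale_intensity` is a hypothetical stand-in, not the MONAI transform itself):

```python
def scale_intensity(values):
    """Min-max scale a flat list of intensities into [0.0, 1.0],
    mirroring what ScaleIntensityD (default args) does to an image array."""
    lo, hi = min(values), max(values)
    if hi == lo:  # constant image: avoid division by zero
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

print(scale_intensity([0, 64, 128, 255]))  # smallest value maps to 0.0, largest to 1.0
```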
transform = Compose(
[
LoadImageD(keys="image", image_only=True),
EnsureChannelFirstD(keys="image"),
ScaleIntensityD(keys="image"),
]
)
Prepare datasets using MONAI Apps
We use MedNISTDataset from MONAI Apps to download the dataset to the specified directory and apply the pre-processing pipeline defined in the monai.transforms Compose above.
The MedNIST dataset was gathered from several sets from TCIA, the RSNA Bone Age Challenge, and the NIH Chest X-ray dataset.
The dataset is kindly made available by Dr. Bradley J. Erickson M.D., Ph.D. (Department of Radiology, Mayo Clinic) under the Creative Commons CC BY-SA 4.0 license.
If you use the MedNIST dataset, please acknowledge the source.
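MedNIST contains six image classes, which is why the network below is built with out_channels=6. As a quick reference, the class folders and the integer labels they map to (label assignment by sorted folder order is stated here as an assumption about the on-disk layout):

```python
# The six MedNIST class folders; integer labels follow sorted folder order
# (this ordering is an assumption for illustration).
class_names = ["AbdomenCT", "BreastMRI", "CXR", "ChestCT", "Hand", "HeadCT"]
label_of = {name: i for i, name in enumerate(class_names)}
print(label_of)  # six classes mapped to labels 0..5
```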
dataset = MedNISTDataset(root_dir=root_dir, transform=transform, section="training", download=True)
MedNIST.tar.gz: 59.0MB [00:05, 10.5MB/s]
2026-04-07 23:43:46,759 - INFO - Downloaded: /tmp/tmp62hi5aur/MedNIST.tar.gz
2026-04-07 23:43:46,819 - INFO - Verified 'MedNIST.tar.gz', md5: 0bc7306e7427e00ad1c5526a6677552d.
2026-04-07 23:43:46,820 - INFO - Writing into directory: /tmp/tmp62hi5aur.
Loading dataset: 100%|██████████| 47164/47164 [00:21<00:00, 2146.98it/s]
Define a network and a supervised trainer
To train a model for this classification task, we will use DenseNet-121, which is well known for its performance on the ImageNet dataset.
For a typical supervised training workflow, MONAI provides SupervisedTrainer, which bundles the network, optimizer, loss function, data loader, and handlers into a single configurable training loop.
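Under the hood, each trainer iteration runs the familiar forward / loss / backward / update cycle. A toy sketch of that cycle with a scalar linear model and plain-Python gradient descent (not MONAI code, just the pattern SupervisedTrainer automates):

```python
def train_toy(xs, ys, lr=0.1, epochs=100):
    """Fit y = w * x by gradient descent on mean squared error,
    mirroring the forward / loss / backward / update steps of a trainer loop."""
    w = 0.0
    for _ in range(epochs):
        # forward pass
        preds = [w * x for x in xs]
        # gradient of the MSE loss w.r.t. w (the "backward" step)
        grad = sum(2 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / len(xs)
        # optimizer update
        w -= lr * grad
    return w

w = train_toy([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
print(round(w, 3))  # converges toward the true slope 2.0
```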
if torch.cuda.is_available():
    torch.cuda.empty_cache()
    torch.cuda.reset_peak_memory_stats()

max_epochs = 5
device = torch.device("cuda:0" if torch.cuda.device_count() > 0 else "cpu")
model = densenet121(spatial_dims=2, in_channels=1, out_channels=6).to(device)
logging.basicConfig(stream=sys.stdout, level=logging.INFO)
trainer = SupervisedTrainer(
device=device,
max_epochs=max_epochs,
train_data_loader=DataLoader(dataset, batch_size=512, shuffle=True, num_workers=4),
network=model,
optimizer=torch.optim.Adam(model.parameters(), lr=1e-5),
loss_function=torch.nn.CrossEntropyLoss(),
inferer=SimpleInferer(),
train_handlers=StatsHandler(),
)
Run the training
trainer.run()
INFO:ignite.engine.engine.SupervisedTrainer:Engine run resuming from iteration 0, epoch 0 until 5 epochs
2026-04-07 23:44:13,471 - INFO - Epoch: 1/5, Iter: 1/93 -- label: 1.0000 loss: 1.8065
2026-04-07 23:44:13,600 - INFO - Epoch: 1/5, Iter: 2/93 -- label: 3.0000 loss: 1.7618
2026-04-07 23:44:13,734 - INFO - Epoch: 1/5, Iter: 3/93 -- label: 1.0000 loss: 1.7557
...
2026-04-07 23:44:25,954 - INFO - Epoch: 1/5, Iter: 93/93 -- label: 5.0000 loss: 0.5426
INFO:ignite.engine.engine.SupervisedTrainer:Epoch[1] Complete. Time taken: 00:00:12.928
2026-04-07 23:44:26,287 - INFO - Epoch: 2/5, Iter: 1/93 -- label: 2.0000 loss: 0.3790
...
2026-04-07 23:44:37,638 - INFO - Epoch: 2/5, Iter: 93/93 -- label: 4.0000 loss: 0.1750
INFO:ignite.engine.engine.SupervisedTrainer:Epoch[2] Complete. Time taken: 00:00:11.684
2026-04-07 23:44:38,206 - INFO - Epoch: 3/5, Iter: 1/93 -- label: 0.0000 loss: 0.1290
...
2026-04-07 23:44:49,670 - INFO - Epoch: 3/5, Iter: 93/93 -- label: 5.0000 loss: 0.0686
INFO:ignite.engine.engine.SupervisedTrainer:Epoch[3] Complete. Time taken: 00:00:12.031
2026-04-07 23:44:49,985 - INFO - Epoch: 4/5, Iter: 1/93 -- label: 4.0000 loss: 0.0873
...
2026-04-07 23:44:56,865 - INFO - Epoch: 4/5, Iter: 54/93 -- label: 1.0000 loss: 0.0681
2026-04-07 23:44:56,988 - INFO - Epoch: 4/5, Iter: 55/93 -- label: 3.0000 loss: 0.0504
2026-04-07 23:44:57,120 - INFO - Epoch: 4/5, Iter: 56/93 -- label: 2.0000 loss: 0.0581
2026-04-07 23:44:57,260 - INFO - Epoch: 4/5, Iter: 57/93 -- label: 3.0000 loss: 0.0534
2026-04-07 23:44:57,394 - INFO - Epoch: 4/5, Iter: 58/93 -- label: 1.0000 loss: 0.0702
2026-04-07 23:44:57,699 - INFO - Epoch: 4/5, Iter: 59/93 -- label: 4.0000 loss: 0.0559
2026-04-07 23:44:57,824 - INFO - Epoch: 4/5, Iter: 60/93 -- label: 0.0000 loss: 0.0698
2026-04-07 23:44:57,963 - INFO - Epoch: 4/5, Iter: 61/93 -- label: 5.0000 loss: 0.0520
2026-04-07 23:44:58,087 - INFO - Epoch: 4/5, Iter: 62/93 -- label: 5.0000 loss: 0.0603
2026-04-07 23:44:58,218 - INFO - Epoch: 4/5, Iter: 63/93 -- label: 5.0000 loss: 0.0509
2026-04-07 23:44:58,337 - INFO - Epoch: 4/5, Iter: 64/93 -- label: 5.0000 loss: 0.0516
2026-04-07 23:44:58,478 - INFO - Epoch: 4/5, Iter: 65/93 -- label: 5.0000 loss: 0.0675
2026-04-07 23:44:58,616 - INFO - Epoch: 4/5, Iter: 66/93 -- label: 5.0000 loss: 0.0613
2026-04-07 23:44:58,736 - INFO - Epoch: 4/5, Iter: 67/93 -- label: 2.0000 loss: 0.0518
2026-04-07 23:44:58,866 - INFO - Epoch: 4/5, Iter: 68/93 -- label: 1.0000 loss: 0.0712
2026-04-07 23:44:59,010 - INFO - Epoch: 4/5, Iter: 69/93 -- label: 0.0000 loss: 0.0542
2026-04-07 23:44:59,142 - INFO - Epoch: 4/5, Iter: 70/93 -- label: 2.0000 loss: 0.0615
2026-04-07 23:44:59,273 - INFO - Epoch: 4/5, Iter: 71/93 -- label: 2.0000 loss: 0.0482
2026-04-07 23:44:59,394 - INFO - Epoch: 4/5, Iter: 72/93 -- label: 4.0000 loss: 0.0637
2026-04-07 23:44:59,529 - INFO - Epoch: 4/5, Iter: 73/93 -- label: 5.0000 loss: 0.0631
2026-04-07 23:44:59,661 - INFO - Epoch: 4/5, Iter: 74/93 -- label: 1.0000 loss: 0.0559
2026-04-07 23:44:59,792 - INFO - Epoch: 4/5, Iter: 75/93 -- label: 2.0000 loss: 0.0513
2026-04-07 23:44:59,922 - INFO - Epoch: 4/5, Iter: 76/93 -- label: 5.0000 loss: 0.0673
2026-04-07 23:45:00,054 - INFO - Epoch: 4/5, Iter: 77/93 -- label: 4.0000 loss: 0.0629
2026-04-07 23:45:00,182 - INFO - Epoch: 4/5, Iter: 78/93 -- label: 2.0000 loss: 0.0504
2026-04-07 23:45:00,307 - INFO - Epoch: 4/5, Iter: 79/93 -- label: 4.0000 loss: 0.0603
2026-04-07 23:45:00,431 - INFO - Epoch: 4/5, Iter: 80/93 -- label: 1.0000 loss: 0.0440
2026-04-07 23:45:00,554 - INFO - Epoch: 4/5, Iter: 81/93 -- label: 0.0000 loss: 0.0565
2026-04-07 23:45:00,684 - INFO - Epoch: 4/5, Iter: 82/93 -- label: 0.0000 loss: 0.0631
2026-04-07 23:45:00,813 - INFO - Epoch: 4/5, Iter: 83/93 -- label: 2.0000 loss: 0.0627
2026-04-07 23:45:00,937 - INFO - Epoch: 4/5, Iter: 84/93 -- label: 0.0000 loss: 0.0504
2026-04-07 23:45:01,080 - INFO - Epoch: 4/5, Iter: 85/93 -- label: 2.0000 loss: 0.0492
2026-04-07 23:45:01,200 - INFO - Epoch: 4/5, Iter: 86/93 -- label: 3.0000 loss: 0.0594
2026-04-07 23:45:01,336 - INFO - Epoch: 4/5, Iter: 87/93 -- label: 5.0000 loss: 0.0615
2026-04-07 23:45:01,473 - INFO - Epoch: 4/5, Iter: 88/93 -- label: 2.0000 loss: 0.0490
2026-04-07 23:45:01,597 - INFO - Epoch: 4/5, Iter: 89/93 -- label: 0.0000 loss: 0.0692
2026-04-07 23:45:01,731 - INFO - Epoch: 4/5, Iter: 90/93 -- label: 5.0000 loss: 0.0498
2026-04-07 23:45:01,858 - INFO - Epoch: 4/5, Iter: 91/93 -- label: 2.0000 loss: 0.0560
2026-04-07 23:45:01,984 - INFO - Epoch: 4/5, Iter: 92/93 -- label: 0.0000 loss: 0.0403
2026-04-07 23:45:02,051 - INFO - Epoch: 4/5, Iter: 93/93 -- label: 1.0000 loss: 0.0605
INFO:ignite.engine.engine.SupervisedTrainer:Epoch[4] Complete. Time taken: 00:00:12.381
2026-04-07 23:45:02,354 - INFO - Epoch: 5/5, Iter: 1/93 -- label: 0.0000 loss: 0.0574
2026-04-07 23:45:02,490 - INFO - Epoch: 5/5, Iter: 2/93 -- label: 0.0000 loss: 0.0637
2026-04-07 23:45:02,624 - INFO - Epoch: 5/5, Iter: 3/93 -- label: 4.0000 loss: 0.0443
2026-04-07 23:45:02,757 - INFO - Epoch: 5/5, Iter: 4/93 -- label: 5.0000 loss: 0.0421
2026-04-07 23:45:03,140 - INFO - Epoch: 5/5, Iter: 5/93 -- label: 0.0000 loss: 0.0573
2026-04-07 23:45:03,269 - INFO - Epoch: 5/5, Iter: 6/93 -- label: 5.0000 loss: 0.0384
2026-04-07 23:45:03,399 - INFO - Epoch: 5/5, Iter: 7/93 -- label: 5.0000 loss: 0.0551
2026-04-07 23:45:03,531 - INFO - Epoch: 5/5, Iter: 8/93 -- label: 3.0000 loss: 0.0606
2026-04-07 23:45:03,658 - INFO - Epoch: 5/5, Iter: 9/93 -- label: 5.0000 loss: 0.0428
2026-04-07 23:45:03,783 - INFO - Epoch: 5/5, Iter: 10/93 -- label: 3.0000 loss: 0.0499
2026-04-07 23:45:03,909 - INFO - Epoch: 5/5, Iter: 11/93 -- label: 4.0000 loss: 0.0433
2026-04-07 23:45:04,044 - INFO - Epoch: 5/5, Iter: 12/93 -- label: 3.0000 loss: 0.0528
2026-04-07 23:45:04,173 - INFO - Epoch: 5/5, Iter: 13/93 -- label: 3.0000 loss: 0.0442
2026-04-07 23:45:04,300 - INFO - Epoch: 5/5, Iter: 14/93 -- label: 3.0000 loss: 0.0548
2026-04-07 23:45:04,421 - INFO - Epoch: 5/5, Iter: 15/93 -- label: 4.0000 loss: 0.0627
2026-04-07 23:45:04,542 - INFO - Epoch: 5/5, Iter: 16/93 -- label: 3.0000 loss: 0.0500
2026-04-07 23:45:04,670 - INFO - Epoch: 5/5, Iter: 17/93 -- label: 5.0000 loss: 0.0548
2026-04-07 23:45:04,802 - INFO - Epoch: 5/5, Iter: 18/93 -- label: 4.0000 loss: 0.0474
2026-04-07 23:45:04,931 - INFO - Epoch: 5/5, Iter: 19/93 -- label: 2.0000 loss: 0.0533
2026-04-07 23:45:05,070 - INFO - Epoch: 5/5, Iter: 20/93 -- label: 0.0000 loss: 0.0485
2026-04-07 23:45:05,203 - INFO - Epoch: 5/5, Iter: 21/93 -- label: 1.0000 loss: 0.0433
2026-04-07 23:45:05,344 - INFO - Epoch: 5/5, Iter: 22/93 -- label: 1.0000 loss: 0.0477
2026-04-07 23:45:05,474 - INFO - Epoch: 5/5, Iter: 23/93 -- label: 3.0000 loss: 0.0462
2026-04-07 23:45:05,592 - INFO - Epoch: 5/5, Iter: 24/93 -- label: 4.0000 loss: 0.0556
2026-04-07 23:45:05,714 - INFO - Epoch: 5/5, Iter: 25/93 -- label: 2.0000 loss: 0.0494
2026-04-07 23:45:05,840 - INFO - Epoch: 5/5, Iter: 26/93 -- label: 1.0000 loss: 0.0490
2026-04-07 23:45:05,967 - INFO - Epoch: 5/5, Iter: 27/93 -- label: 3.0000 loss: 0.0429
2026-04-07 23:45:06,097 - INFO - Epoch: 5/5, Iter: 28/93 -- label: 1.0000 loss: 0.0458
2026-04-07 23:45:06,226 - INFO - Epoch: 5/5, Iter: 29/93 -- label: 3.0000 loss: 0.0545
2026-04-07 23:45:06,356 - INFO - Epoch: 5/5, Iter: 30/93 -- label: 3.0000 loss: 0.0436
2026-04-07 23:45:06,485 - INFO - Epoch: 5/5, Iter: 31/93 -- label: 2.0000 loss: 0.0461
2026-04-07 23:45:06,617 - INFO - Epoch: 5/5, Iter: 32/93 -- label: 4.0000 loss: 0.0426
2026-04-07 23:45:06,750 - INFO - Epoch: 5/5, Iter: 33/93 -- label: 4.0000 loss: 0.0369
2026-04-07 23:45:06,877 - INFO - Epoch: 5/5, Iter: 34/93 -- label: 4.0000 loss: 0.0342
2026-04-07 23:45:07,009 - INFO - Epoch: 5/5, Iter: 35/93 -- label: 4.0000 loss: 0.0389
2026-04-07 23:45:07,136 - INFO - Epoch: 5/5, Iter: 36/93 -- label: 5.0000 loss: 0.0572
2026-04-07 23:45:07,263 - INFO - Epoch: 5/5, Iter: 37/93 -- label: 1.0000 loss: 0.0416
2026-04-07 23:45:07,385 - INFO - Epoch: 5/5, Iter: 38/93 -- label: 0.0000 loss: 0.0315
2026-04-07 23:45:07,514 - INFO - Epoch: 5/5, Iter: 39/93 -- label: 3.0000 loss: 0.0569
2026-04-07 23:45:07,642 - INFO - Epoch: 5/5, Iter: 40/93 -- label: 4.0000 loss: 0.0366
2026-04-07 23:45:07,960 - INFO - Epoch: 5/5, Iter: 41/93 -- label: 3.0000 loss: 0.0450
2026-04-07 23:45:08,096 - INFO - Epoch: 5/5, Iter: 42/93 -- label: 0.0000 loss: 0.0523
2026-04-07 23:45:08,224 - INFO - Epoch: 5/5, Iter: 43/93 -- label: 0.0000 loss: 0.0441
2026-04-07 23:45:08,355 - INFO - Epoch: 5/5, Iter: 44/93 -- label: 5.0000 loss: 0.0386
2026-04-07 23:45:08,491 - INFO - Epoch: 5/5, Iter: 45/93 -- label: 3.0000 loss: 0.0517
2026-04-07 23:45:08,618 - INFO - Epoch: 5/5, Iter: 46/93 -- label: 2.0000 loss: 0.0330
2026-04-07 23:45:08,742 - INFO - Epoch: 5/5, Iter: 47/93 -- label: 5.0000 loss: 0.0454
2026-04-07 23:45:08,873 - INFO - Epoch: 5/5, Iter: 48/93 -- label: 5.0000 loss: 0.0487
2026-04-07 23:45:09,002 - INFO - Epoch: 5/5, Iter: 49/93 -- label: 0.0000 loss: 0.0446
2026-04-07 23:45:09,122 - INFO - Epoch: 5/5, Iter: 50/93 -- label: 5.0000 loss: 0.0330
2026-04-07 23:45:09,255 - INFO - Epoch: 5/5, Iter: 51/93 -- label: 2.0000 loss: 0.0625
2026-04-07 23:45:09,386 - INFO - Epoch: 5/5, Iter: 52/93 -- label: 2.0000 loss: 0.0492
2026-04-07 23:45:09,513 - INFO - Epoch: 5/5, Iter: 53/93 -- label: 3.0000 loss: 0.0518
2026-04-07 23:45:09,645 - INFO - Epoch: 5/5, Iter: 54/93 -- label: 0.0000 loss: 0.0468
2026-04-07 23:45:09,762 - INFO - Epoch: 5/5, Iter: 55/93 -- label: 1.0000 loss: 0.0298
2026-04-07 23:45:09,885 - INFO - Epoch: 5/5, Iter: 56/93 -- label: 3.0000 loss: 0.0406
2026-04-07 23:45:10,026 - INFO - Epoch: 5/5, Iter: 57/93 -- label: 4.0000 loss: 0.0523
2026-04-07 23:45:10,163 - INFO - Epoch: 5/5, Iter: 58/93 -- label: 1.0000 loss: 0.0473
2026-04-07 23:45:10,297 - INFO - Epoch: 5/5, Iter: 59/93 -- label: 5.0000 loss: 0.0432
2026-04-07 23:45:10,438 - INFO - Epoch: 5/5, Iter: 60/93 -- label: 1.0000 loss: 0.0589
2026-04-07 23:45:10,575 - INFO - Epoch: 5/5, Iter: 61/93 -- label: 1.0000 loss: 0.0541
2026-04-07 23:45:10,700 - INFO - Epoch: 5/5, Iter: 62/93 -- label: 5.0000 loss: 0.0465
2026-04-07 23:45:10,818 - INFO - Epoch: 5/5, Iter: 63/93 -- label: 4.0000 loss: 0.0403
2026-04-07 23:45:10,951 - INFO - Epoch: 5/5, Iter: 64/93 -- label: 0.0000 loss: 0.0504
2026-04-07 23:45:11,083 - INFO - Epoch: 5/5, Iter: 65/93 -- label: 3.0000 loss: 0.0380
2026-04-07 23:45:11,228 - INFO - Epoch: 5/5, Iter: 66/93 -- label: 5.0000 loss: 0.0303
2026-04-07 23:45:11,348 - INFO - Epoch: 5/5, Iter: 67/93 -- label: 5.0000 loss: 0.0513
2026-04-07 23:45:11,486 - INFO - Epoch: 5/5, Iter: 68/93 -- label: 2.0000 loss: 0.0434
2026-04-07 23:45:11,618 - INFO - Epoch: 5/5, Iter: 69/93 -- label: 3.0000 loss: 0.0416
2026-04-07 23:45:11,753 - INFO - Epoch: 5/5, Iter: 70/93 -- label: 0.0000 loss: 0.0380
2026-04-07 23:45:11,888 - INFO - Epoch: 5/5, Iter: 71/93 -- label: 3.0000 loss: 0.0388
2026-04-07 23:45:12,036 - INFO - Epoch: 5/5, Iter: 72/93 -- label: 5.0000 loss: 0.0466
2026-04-07 23:45:12,167 - INFO - Epoch: 5/5, Iter: 73/93 -- label: 5.0000 loss: 0.0345
2026-04-07 23:45:12,296 - INFO - Epoch: 5/5, Iter: 74/93 -- label: 3.0000 loss: 0.0413
2026-04-07 23:45:12,424 - INFO - Epoch: 5/5, Iter: 75/93 -- label: 4.0000 loss: 0.0439
2026-04-07 23:45:12,559 - INFO - Epoch: 5/5, Iter: 76/93 -- label: 3.0000 loss: 0.0389
2026-04-07 23:45:12,700 - INFO - Epoch: 5/5, Iter: 77/93 -- label: 5.0000 loss: 0.0291
2026-04-07 23:45:12,841 - INFO - Epoch: 5/5, Iter: 78/93 -- label: 5.0000 loss: 0.0424
2026-04-07 23:45:13,143 - INFO - Epoch: 5/5, Iter: 79/93 -- label: 0.0000 loss: 0.0303
2026-04-07 23:45:13,285 - INFO - Epoch: 5/5, Iter: 80/93 -- label: 5.0000 loss: 0.0319
2026-04-07 23:45:13,420 - INFO - Epoch: 5/5, Iter: 81/93 -- label: 4.0000 loss: 0.0287
2026-04-07 23:45:13,551 - INFO - Epoch: 5/5, Iter: 82/93 -- label: 3.0000 loss: 0.0367
2026-04-07 23:45:13,673 - INFO - Epoch: 5/5, Iter: 83/93 -- label: 0.0000 loss: 0.0437
2026-04-07 23:45:13,804 - INFO - Epoch: 5/5, Iter: 84/93 -- label: 2.0000 loss: 0.0383
2026-04-07 23:45:13,952 - INFO - Epoch: 5/5, Iter: 85/93 -- label: 4.0000 loss: 0.0448
2026-04-07 23:45:14,075 - INFO - Epoch: 5/5, Iter: 86/93 -- label: 4.0000 loss: 0.0486
2026-04-07 23:45:14,213 - INFO - Epoch: 5/5, Iter: 87/93 -- label: 3.0000 loss: 0.0381
2026-04-07 23:45:14,333 - INFO - Epoch: 5/5, Iter: 88/93 -- label: 0.0000 loss: 0.0420
2026-04-07 23:45:14,453 - INFO - Epoch: 5/5, Iter: 89/93 -- label: 1.0000 loss: 0.0452
2026-04-07 23:45:14,577 - INFO - Epoch: 5/5, Iter: 90/93 -- label: 1.0000 loss: 0.0315
2026-04-07 23:45:14,708 - INFO - Epoch: 5/5, Iter: 91/93 -- label: 3.0000 loss: 0.0382
2026-04-07 23:45:14,841 - INFO - Epoch: 5/5, Iter: 92/93 -- label: 0.0000 loss: 0.0382
2026-04-07 23:45:14,931 - INFO - Epoch: 5/5, Iter: 93/93 -- label: 4.0000 loss: 0.0396
INFO:ignite.engine.engine.SupervisedTrainer:Epoch[5] Complete. Time taken: 00:00:12.880
INFO:ignite.engine.engine.SupervisedTrainer:Engine run finished. Time taken: 00:01:02.015
CPU times: user 56.4 s, sys: 6.15 s, total: 1min 2s
Wall time: 1min 2s
torch.cuda.max_memory_allocated() / 1024**2
5513.73681640625
Check the prediction on the test dataset
dataset_dir = Path(root_dir, "MedNIST")
class_names = sorted(f"{x.name}" for x in dataset_dir.iterdir() if x.is_dir())
testdata = MedNISTDataset(root_dir=root_dir, transform=transform, section="test", download=False, runtime_cache=True)
max_items_to_print = 10
with eval_mode(model):
    for item in DataLoader(testdata, batch_size=1, num_workers=0):
        prob = np.array(model(item["image"].to(device)).detach().to("cpu"))[0]
        pred = class_names[prob.argmax()]
        gt = item["class_name"][0]
        print(f"Class prediction is {pred}. Ground-truth: {gt}")
        max_items_to_print -= 1
        if max_items_to_print == 0:
            break
Class prediction is AbdomenCT. Ground-truth: AbdomenCT
Class prediction is BreastMRI. Ground-truth: BreastMRI
Class prediction is ChestCT. Ground-truth: ChestCT
Class prediction is CXR. Ground-truth: CXR
Class prediction is Hand. Ground-truth: Hand
Class prediction is HeadCT. Ground-truth: HeadCT
Class prediction is HeadCT. Ground-truth: HeadCT
Class prediction is CXR. Ground-truth: CXR
Class prediction is ChestCT. Ground-truth: ChestCT
Class prediction is BreastMRI. Ground-truth: BreastMRI
/tmp/ipykernel_10733/3379698368.py:8: DeprecationWarning: __array__ implementation doesn't accept a copy keyword, so passing copy=False failed. __array__ must implement 'dtype' and 'copy' keyword arguments. To learn more, see the migration guide https://numpy.org/devdocs/numpy_2_0_migration_guide.html#adapting-to-changes-in-the-copy-keyword
prob = np.array(model(item["image"].to(device)).detach().to("cpu"))[0]
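The loop above only spot-checks the first ten test images. To score more than a handful of predictions, the same argmax logic can be factored into a small helper. This is a minimal sketch, not part of MONAI: the `accuracy` function and the toy scores below are our own illustration of how per-class probabilities like `prob` map to a correctness rate.

```python
import numpy as np


def accuracy(probs, labels):
    """Fraction of rows whose argmax matches the integer class label.

    `probs` is an (N, num_classes) array of scores, analogous to the
    per-item model outputs in the loop above; `labels` holds the
    ground-truth class indices.
    """
    preds = np.asarray(probs).argmax(axis=1)
    return float((preds == np.asarray(labels)).mean())


# Toy check with made-up scores: argmax predictions are [0, 1, 0],
# labels are [0, 1, 1], so two of three are correct.
probs = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
print(accuracy(probs, [0, 1, 1]))  # 0.6666666666666666
```

Accumulating `prob.argmax()` against the integer labels over the full test `DataLoader` and feeding them to such a helper would give an overall test accuracy rather than the eyeball check shown above.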