weighted_labels - Error in loss calculation
Hi,
I tried to use iota2 with deep learning and weighted_labels.
I get the following error:
Traceback (most recent call last):
  File "/work/scratch/env/mouretf/.conda/envs/iota2_env/bin/task_launcher.py", line 11, in <module>
    sys.exit(main())
  File "/work/scratch/env/mouretf/.conda/envs/iota2_env/lib/python3.9/site-packages/iota2/task_launcher.py", line 73, in main
    task_launcher(args.dill_file)
  File "/work/scratch/env/mouretf/.conda/envs/iota2_env/lib/python3.9/site-packages/iota2/task_launcher.py", line 62, in task_launcher
    func(**f_kwargs)
  File "/work/scratch/env/mouretf/.conda/envs/iota2_env/lib/python3.9/site-packages/iota2/learning/launch_learning.py", line 168, in learn_torch_model
    train_pytorch_model.torch_learn(
  File "/work/scratch/env/mouretf/.conda/envs/iota2_env/lib/python3.9/site-packages/iota2/learning/pytorch/train_pytorch_model.py", line 862, in torch_learn
    loss = criterion(y_hat, targets, **criterion_kwargs)
  File "/work/scratch/env/mouretf/.conda/envs/iota2_env/lib/python3.9/site-packages/torch/nn/functional.py", line 3014, in cross_entropy
    return torch._C._nn.cross_entropy_loss(input, target, weight, _Reduction.get_enum(reduction), ignore_index, label_smoothing)
RuntimeError: weight tensor should be defined either for all 10 classes or no classes but got weight tensor of shape: [8192, 1]
8192 is my batch size, so I assume that you are trying to pass one weight per sample. The problem is that the cross_entropy loss expects one weight per class (see here: https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html).
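To make the shape mismatch concrete, here is a minimal sketch (tensor names and the random data are placeholders, not iota2's actual variables):

import torch
import torch.nn.functional as F

batch_size, num_classes = 8192, 10
logits = torch.randn(batch_size, num_classes)           # model outputs (y_hat)
targets = torch.randint(0, num_classes, (batch_size,))  # class indices

# What cross_entropy expects: one weight per class, shape [num_classes]
class_weights = torch.ones(num_classes)
loss = F.cross_entropy(logits, targets, weight=class_weights)  # works

# What is apparently being passed: one weight per sample, shape [batch_size, 1]
sample_weights = torch.ones(batch_size, 1)
# F.cross_entropy(logits, targets, weight=sample_weights)
# -> RuntimeError: weight tensor should be defined either for all 10 classes or no classes ...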
If you want to keep the one-weight-per-sample formulation, the loss could be computed as follows (or something similar; I am not sure whether it should also be divided by the mean of the weights):
loss = torch.nn.functional.cross_entropy(output, target, reduction='none')
loss = torch.mean(loss * weights)
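For reference, a self-contained sketch of that idea (the helper name sample_weighted_cross_entropy is hypothetical, not part of iota2 or PyTorch); normalizing by the sum of the weights instead of taking a plain mean keeps the loss on the same scale as the unweighted case:

import torch
import torch.nn.functional as F

def sample_weighted_cross_entropy(logits, targets, sample_weights):
    # Per-sample loss, shape [batch_size]
    per_sample_loss = F.cross_entropy(logits, targets, reduction='none')
    weights = sample_weights.view(-1)  # flatten a possible [batch_size, 1] shape
    # Dividing by the sum of the weights is equivalent to mean(loss * w) / mean(w),
    # so the result reduces to the usual mean when all weights are 1.
    return (per_sample_loss * weights).sum() / weights.sum()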