# PyTorch expects each tensor to be equal size

This article discusses how to fix the PyTorch error "expects each tensor to be equal size". So let's start.

## How to solve "PyTorch expects each tensor to be equal size"

As the PyTorch docs for `torch.stack()` state, all input tensors must have the same shape to be stacked. I don't know how you will be using the `embedding_matrix`, but you have two options: either pad your tensors with zeros up to a fixed, user-defined length to make them equidimensional (recommended if you will train on the stacked tensor; refer to this tutorial), or simply concatenate them along an existing dimension with something like `torch.cat(data, dim=0)`.
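A minimal sketch of both options, using small zero tensors as stand-ins for your embedding rows (the variable names here are illustrative, not from the original question):

```python
import torch

# Three "embedding" tensors of unequal length -- torch.stack() would
# raise "stack expects each tensor to be equal size" on these.
a = torch.zeros(3)
b = torch.zeros(5)
c = torch.zeros(4)

# Option 1: zero-pad each tensor up to the longest one, then stack.
# pad_sequence handles the padding and stacking in one call.
padded = torch.nn.utils.rnn.pad_sequence([a, b, c], batch_first=True)
print(padded.shape)  # torch.Size([3, 5])

# Option 2: concatenate along an existing dimension instead of stacking.
flat = torch.cat([a, b, c], dim=0)
print(flat.shape)  # torch.Size([12])
```

`pad_sequence` pads to the length of the longest tensor in the batch; if you need a fixed user-defined length instead, pad each tensor individually with `torch.nn.functional.pad` before stacking.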



Original author of this content: Satya Prakash Dash

## Conclusion

That's all for this tutorial. I hope it helped. Thank you.