When I integrate this code and set the batch size greater than 1 per GPU, it gives me this error:
labels = torch._C._nn.pad_sequence(
RuntimeError: The size of tensor a (1966) must match the size of tensor b (1952) at non-singleton dimension 2
@vjagannath786 The error you're encountering, RuntimeError: The size of tensor a (1966) must match the size of tensor b (1952) at non-singleton dimension 2, is due to a mismatch in the lengths of the sequences when padding the tensors.
This typically happens when different examples have input sequences of varying lengths, and the padding logic does not align them correctly.
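To see why this error appears, note that `torch.nn.utils.rnn.pad_sequence` only pads along the first (sequence) dimension; every trailing dimension must already match across the tensors in the batch. A minimal reproduction with made-up shapes (the 1966 vs. 1952 sizes are taken from the reported error, the other dimensions are arbitrary):

```python
import torch
from torch.nn.utils.rnn import pad_sequence

# pad_sequence aligns tensors along dim 0 only; a mismatch in any later
# dimension (here dim 2: 1966 vs 1952) raises the reported RuntimeError.
a = torch.zeros(3, 4, 1966)
b = torch.zeros(5, 4, 1952)

try:
    pad_sequence([a, b], batch_first=True)
except RuntimeError as e:
    print(e)  # size of tensor a (1966) must match ... at non-singleton dimension 2
```

So the fix is to make the trailing dimensions equal before calling `pad_sequence`, which is what the suggestion below does.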
Suggested Fix
One way to fix this is to ensure that all sequences have the same length before concatenating and padding them. Here’s how you can modify your code:
1. Ensure consistent input sequence lengths: pad all input sequences so they have the same length.
2. Adjust the padding logic: align sequences to the maximum length within each batch.
def __call__(self, examples):
    IGNORE_INDEX = -100  # label value ignored by the loss
    all_input_ids = []
    all_label_ids = []
    all_pixel_values = []
    all_image_sizes = []
    for example in examples:
        image = example['images'][0]
        text_dict = example['texts'][0]
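The per-batch padding described in step 2 can be sketched as a standalone helper. This is a minimal sketch, not the collator's actual code: `pad_and_stack` is a hypothetical name, and `PAD_TOKEN_ID = 0` is an assumption you should replace with your tokenizer's `pad_token_id`.

```python
import torch

IGNORE_INDEX = -100  # positions masked out of the loss
PAD_TOKEN_ID = 0     # assumption: use tokenizer.pad_token_id in practice

def pad_and_stack(all_input_ids, all_label_ids):
    """Pad 1-D input_ids/label tensors to the batch max length, then stack."""
    max_len = max(ids.size(0) for ids in all_input_ids)
    padded_inputs, padded_labels = [], []
    for ids, labels in zip(all_input_ids, all_label_ids):
        pad = max_len - ids.size(0)
        # Pad inputs with the pad token and labels with IGNORE_INDEX so the
        # extra positions contribute nothing to the loss.
        padded_inputs.append(
            torch.cat([ids, torch.full((pad,), PAD_TOKEN_ID, dtype=ids.dtype)]))
        padded_labels.append(
            torch.cat([labels, torch.full((pad,), IGNORE_INDEX, dtype=labels.dtype)]))
    return torch.stack(padded_inputs), torch.stack(padded_labels)
```

Because every tensor is brought to the same length before stacking, the size-mismatch error cannot occur regardless of the per-GPU batch size.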