Populating a new chunk from scratch results in an ArrayIndexOutOfBoundsException #53
So if I set it to the max number of blocks, will it shrink when I run the cleanup, or will I need to do that as well? I suppose I could also do preprocessing, but I was doing it in a streaming manner instead for performance.
You could set the initial length of the block states array to its maximum, fill it manually, and then let `cleanupPaletteAndBlockStates()` shrink it:

```java
CompoundTag sectionData = new CompoundTag();
long[] blockStates = new long[4096];
ListTag<CompoundTag> palette = new ListTag<>(CompoundTag.class);
for (int index = 0; index < 4096; index++) {
    int paletteIndex = 0; // calculate paletteIndex here
    blockStates[index] = paletteIndex; // setting the palette index manually
}
sectionData.put("Palette", palette);
sectionData.put("BlockStates", blockStates);
Section section = new Section(sectionData, 2584);
section.cleanupPaletteAndBlockStates();
```
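To make the shrink question concrete: a cleanup pass conceptually drops palette entries that no block references and remaps the remaining indices so they fit in fewer bits. A minimal, library-independent sketch of that remapping step (the `compact` helper and the string palette are hypothetical illustrations, not the library's actual API):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class PaletteCleanup {
    // Remaps indices so only palette entries that are actually used survive,
    // in first-use order. Fills newPalette and returns the remapped indices.
    static int[] compact(int[] indices, List<String> palette, List<String> newPalette) {
        Map<Integer, Integer> remap = new LinkedHashMap<>();
        int[] out = new int[indices.length];
        for (int i = 0; i < indices.length; i++) {
            int old = indices[i];
            Integer fresh = remap.get(old);
            if (fresh == null) {
                fresh = remap.size();          // next free slot in the new palette
                remap.put(old, fresh);
                newPalette.add(palette.get(old));
            }
            out[i] = fresh;
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> palette = Arrays.asList("air", "stone", "dirt");
        List<String> newPalette = new ArrayList<>();
        int[] out = compact(new int[]{2, 2, 0}, palette, newPalette);
        System.out.println(Arrays.toString(out) + " " + newPalette);
    }
}
```

Once the palette is smaller, the indices can be repacked at a narrower bit width, which is what makes the oversized initial allocation harmless.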
Thanks! Giving it a shot. Want me to close this?
I am using this library to create an entirely new chunk, copied from the bottom up out of an existing chunk that I've loaded from elsewhere. I plan on making changes to the chunk as it copies in the future, but for now it's a straight copy where I have to manually create the block state for each x, y, and z. As it builds up the copied chunk, it seems to work fine until the palette reaches a certain number of entries, at which point I get this exception:
I believe this is happening when the block data has to resize its palette index bit width because we've added one palette entry too many. In Section on line 137 I see some logic that looks like it's meant to address the issue.
I'm not sure whether the issue is in how I'm manually creating the block state or something else. Thanks for your assistance!
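The resize described above can be sketched independently of the library: the chunk format packs one palette index per block at a fixed bit width (at least 4 bits), so once the palette grows past a power of two, every already-written index must be repacked at a wider width, and an off-by-one in that repacking is exactly where an ArrayIndexOutOfBoundsException would surface. A minimal sketch under those assumptions (helper names are hypothetical, and straddling values across longs follows the pre-1.16 layout):

```java
public class PackedStates {
    // Bits needed to store indices for a palette of the given size,
    // with Minecraft's 4-bit minimum.
    static int bitsNeeded(int paletteSize) {
        int raw = 32 - Integer.numberOfLeadingZeros(Math.max(1, paletteSize - 1));
        return Math.max(4, raw);
    }

    // Packs the indices into a long[] at the given bit width.
    static long[] pack(int[] indices, int bits) {
        long[] out = new long[(indices.length * bits + 63) / 64];
        for (int i = 0; i < indices.length; i++) {
            int bitIndex = i * bits;
            int longIndex = bitIndex >> 6;   // which long this index starts in
            int offset = bitIndex & 63;      // bit offset within that long
            out[longIndex] |= ((long) indices[i]) << offset;
            if (offset + bits > 64) {
                // value straddles two longs (pre-1.16 layout)
                out[longIndex + 1] |= ((long) indices[i]) >>> (64 - offset);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // Crossing a power-of-two palette size widens every index.
        System.out.println(bitsNeeded(16) + " -> " + bitsNeeded(17));
    }
}
```

Note that growing the palette from 16 to 17 entries changes the width from 4 to 5 bits, so the packed array gets longer and every previously written index moves; code that keeps writing with the old width or array length will read or write out of bounds.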