I am trying to upsample lossy-compressed point cloud data with PU-Net, but the output contains four separate copies of the object at different scales.

The input is the 8i people dataset, lossy-compressed with MPEG TMC13. Unfortunately I cannot attach pictures here.

I only modified `dataset.py` so that it loads PLY files directly; everything else is at the default configuration. When I feed the provided output files `camel.ply` and `cow.ply` back in for a second round of upsampling, the result is correct, so the loader itself is probably not the cause.
```python
import os

import numpy as np
import torch.utils.data as torch_data
from plyfile import PlyData


class PUNET_Dataset_Whole(torch_data.Dataset):
    def __init__(self, data_dir='./datas/test_data/our_collected_data/test_ply'):
        super().__init__()
        file_list = os.listdir(data_dir)
        self.names = [x.split('.')[0] for x in file_list]
        self.sample_path = [os.path.join(data_dir, x) for x in file_list]

    def __len__(self):
        return len(self.names)

    def read_ply(self, filename):
        # Read the vertex coordinates of a PLY file into an (N, 3) array.
        plydata = PlyData.read(filename)
        pc = plydata['vertex'].data
        pc_array = np.array([[x, y, z] for x, y, z in pc])
        return pc_array

    def __getitem__(self, index):
        # points = np.loadtxt(self.sample_path[index])
        points = self.read_ply(self.sample_path[index])
        return points
```
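For what it's worth, the loader above returns raw coordinates exactly as decoded by TMC13, which can be in voxel units with a large spread. If the network was trained on inputs normalized to the unit sphere (an assumption worth checking against the training pipeline), un-normalized clouds could behave strangely. A minimal, hypothetical normalization helper would look like this:

```python
import numpy as np


def normalize_unit_sphere(points):
    """Center a point cloud and scale it into the unit sphere.

    Hypothetical helper, not part of the PU-Net repo: returns the
    normalized points plus the centroid and radius needed to undo
    the transform on the upsampled output.
    """
    centroid = points.mean(axis=0)
    centered = points - centroid
    radius = np.linalg.norm(centered, axis=1).max()
    return centered / radius, centroid, radius


# Stand-in for one decoded PLY: large voxel-scale coordinates.
pts = np.random.rand(5000, 3) * 1024.0
norm_pts, centroid, radius = normalize_unit_sphere(pts)
print(norm_pts.shape, np.linalg.norm(norm_pts, axis=1).max())
```

The upsampled result would then be mapped back with `out * radius + centroid`.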
At first I thought the data was simply too complex, so I sampled 5k points from a small region, but the result is still wrong. I am really confused now and would appreciate any help.
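For reproducibility, the 5k-point local crop mentioned above can be done by taking the k points nearest to a seed point (a sketch; the seed index and count are arbitrary choices, and `crop_k_nearest` is a hypothetical helper, not from the repo):

```python
import numpy as np


def crop_k_nearest(points, k=5000, seed_idx=0):
    # Take the k points closest to a chosen seed point,
    # i.e. a small local patch of the full cloud.
    dists = np.linalg.norm(points - points[seed_idx], axis=1)
    idx = np.argsort(dists)[:k]
    return points[idx]


cloud = np.random.rand(100000, 3)  # stand-in for a decoded 8i frame
patch = crop_k_nearest(cloud, k=5000)
print(patch.shape)  # (5000, 3)
```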