Exported analysis file node locations are inconsistent with GUI? #890
-
Hi all, sorry to bother again so soon, but I believe I'm hitting another issue during analysis.
Here's where I hit a problem. For this particular video/project, there are 18,000 frames, 7 nodes, and just 1 track (rat). Below is an example of the code I'm working with. I don't think it's doing anything out of the ordinary relative to the docs, so I'm a bit confused as to where things might be going wrong.
import json
from pathlib import Path
from typing import List
import h5py
import matplotlib.pyplot as plt
import numpy as np
# This is where we get the path to the exported `.h5` file.
all_frames_analysis_file: Path = Path(...)
# Isolate the datasets available within our H5 analysis file.
with h5py.File(str(all_frames_analysis_file), "r") as h5_file:
    dset_names: List[str] = sorted(h5_file.keys())
    node_names: List[str] = [
        node_name.decode().lower().replace(" ", "_")
        for node_name in h5_file["node_names"][:]
    ]
    locations: np.ndarray = h5_file["tracks"][:].T
    track_names: List[str] = [track_name.decode() for track_name in h5_file["track_names"][:]]
# Print the available data keys within the analysis file.
print(f"Analysis file [{all_frames_analysis_file}] has datasets: {json.dumps(dset_names, indent=4)}")
# Analysis file [...] has datasets: [
# "instance_scores",
# "node_names",
# "point_scores",
# "track_names",
# "track_occupancy",
# "tracking_scores",
# "tracks"
# ]
# Print useful analysis descriptions.
frame_count, node_count, _, instance_count = locations.shape
print(f"Locations data shape: {locations.shape}")
print(f"Number of video frames: {frame_count:,}")
print(f"Number of skeleton nodes: {node_count:,}")
print(f"Number of tracks: {instance_count:,}")
print(f"Node names (after processing): {node_names}")
print(f"Tracks: {dict(enumerate(track_names))}")
# Locations data shape: (18000, 7, 2, 1)
# Number of video frames: 18,000
# Number of skeleton nodes: 7
# Number of tracks: 1
# Node names (after processing): ['nose', 'left_ear', 'right_ear', 'center', 'base_of_tail', 'middle_of_tail', 'tip_of_tail']
# Tracks: {0: 'track_0'}
# In this video, we only have one rat/track visible: `track_0`.
# So get frame 7 (index 6), all nodes (:), all coordinates (:), and `track_0` (index 0).
# Note: two of its nodes are missing/occluded, but that's fine.
locations_at_frame: np.ndarray = locations[6, :, :, 0]
locations_at_frame
# array([[1048.91052246, 520.03918457],
# [1072.59423828, 488.26348877],
# [1024.64941406, 491.54833984],
# [1052.79345703, 387.93057251],
# [ nan, nan],
# [ nan, nan],
# [ 920.64355469, 300.52679443]])
# Plot the points.
fig, ax = plt.subplots()
ax.scatter(x=locations_at_frame[:, 0], y=locations_at_frame[:, 1], c="coral")
# Add the node names as annotations.
for node_name, node_xy in zip(node_names, locations_at_frame):
    node_x, node_y = node_xy
    ax.annotate(node_name, (node_x, node_y))
The final scatter plot for the track looks reasonable on its own. But if you look at the same frame within the SLEAP GUI, the nodes appear misplaced. Specifically, they look reflected over the x-axis (i.e., the right ear is positionally to the left of the nose). I've tried looking through other discussions, but all I could find were threads about model predictions not resizing to different videos appropriately. On my local machine I'm using the …
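As a side note on the `.T` in the loading code, the shape arithmetic can be sketched with plain numpy. The raw `(tracks, coords, nodes, frames)` layout below is an assumption based on SLEAP's analysis-file docs; the counts match this project:

```python
import numpy as np

# Assumed raw layout of the `tracks` dataset: (tracks, coords, nodes, frames).
raw_tracks = np.zeros((1, 2, 7, 18000))

# `.T` on a 4-D array reverses all axes, giving (frames, nodes, coords, tracks).
locations = raw_tracks.T
print(locations.shape)  # (18000, 7, 2, 1)
```

That reversed order is why `frame_count, node_count, _, instance_count = locations.shape` unpacks the way it does above.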
Replies: 1 comment 3 replies
-
Hi @dataframing,
The coordinates are returned as Qt (the library we use for the GUI) returns them, with the (0, 0) coordinate in the upper left-hand corner. As you observed, matplotlib plots the coordinates with (0, 0) in the lower left-hand corner. You can invert the y-axis of the matplotlib plot for the visuals to appear as seen in the video.

Let us know if that helps!
Thanks,
Liezl
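A minimal sketch of the suggested fix, using `ax.invert_yaxis()`. The coordinates are abbreviated from the frame shown in the question (NaN rows are the occluded nodes), and the `Agg` backend is used only so the snippet runs headless:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend for this sketch
import matplotlib.pyplot as plt

# Node coordinates in image space, abbreviated from the example frame above.
# NaN rows are missing/occluded nodes; matplotlib simply skips them.
locations_at_frame = np.array([
    [1048.9, 520.0],
    [1072.6, 488.3],
    [1024.6, 491.5],
    [1052.8, 387.9],
    [np.nan, np.nan],
    [np.nan, np.nan],
    [920.6, 300.5],
])

fig, ax = plt.subplots()
ax.scatter(locations_at_frame[:, 0], locations_at_frame[:, 1], c="coral")
ax.invert_yaxis()  # put (0, 0) at the upper left, matching the Qt/GUI convention
print(ax.yaxis_inverted())  # True
```

With the y-axis inverted, the scatter plot matches the orientation seen in the SLEAP GUI.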