
Add support for tracked segmentation masks? #301

Open
sfmig opened this issue Sep 11, 2024 · 4 comments

@sfmig
Contributor

sfmig commented Sep 11, 2024

With tools like SAM2 that track segmentation masks across video frames, this could be a nice addition to our set of accepted input data.

Segmentation masks are usually represented with RLE (run-length encoding); a minimal decoding sketch follows the list below:

  • an example file with a description (click Overview, then the Submission file section)
  • a nice video explanation here; see the "Segmentation (RLE - Run Length Encoding)" bookmark
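
For illustration, here is a minimal sketch of decoding an uncompressed, COCO-style RLE counts list into a binary NumPy mask. The function name and example counts are hypothetical; the column-major layout and "runs start with background" convention are assumptions carried over from the COCO format:

```python
import numpy as np

def decode_rle(counts, height, width):
    """Decode an uncompressed RLE (alternating run lengths of background and
    foreground pixels, column-major order) into a 2D binary mask."""
    flat = np.zeros(height * width, dtype=np.uint8)
    position, value = 0, 0  # runs start with the background value (0)
    for run_length in counts:
        flat[position:position + run_length] = value
        position += run_length
        value = 1 - value  # alternate between background (0) and foreground (1)
    return flat.reshape((height, width), order="F")  # column-major, as in COCO

# Example: a 4x4 mask with a 2x2 foreground block in the top-left corner
mask = decode_rle([0, 2, 2, 2, 10], height=4, width=4)
```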

Maybe we could also consider linking masks to bounding boxes?
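
If we do link them, a bounding box could be derived directly from each mask. A minimal sketch, where the function name and the (x_min, y_min, width, height) output convention are purely illustrative:

```python
import numpy as np

def mask_to_bbox(mask):
    """Return (x_min, y_min, width, height) covering the foreground pixels
    of a 2D binary mask, or None if the mask is empty."""
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        return None
    x_min, x_max = cols.min(), cols.max()
    y_min, y_max = rows.min(), rows.max()
    return (x_min, y_min, x_max - x_min + 1, y_max - y_min + 1)
```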

We may also want to check out Annolid, an annotation and tracking tool centred around instance segmentation.

@sfmig sfmig converted this from a draft issue Sep 11, 2024
@niksirbi
Member

niksirbi commented Nov 5, 2024

@gchindemi has also expressed strong interest in movement supporting this type of tracking data. Given the growing popularity of models like SAM, I think the appetite for this sort of thing will keep growing, which means we should probably implement this.

@gchindemi

I will definitely keep an eye on this. We want to experiment with segmentation masks as input for social behavior analysis in LISBET, and having the functionality implemented in movement could help greatly.

@niksirbi niksirbi moved this from 🤔 Triage to 📝 Todo in movement progress tracker Nov 15, 2024
@SkepticRaven

I don't know how many other groups have tried to store segmentation data for videos, but the data storage problem may end up being more complex than just using something like RLE. I ended up storing contours instead of masks.
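
To illustrate the contour idea, here is a minimal sketch of turning a per-frame binary mask into a polygon with OpenCV. It is a generic example with made-up values, not the custom format mentioned below:

```python
import cv2
import numpy as np

# Hypothetical binary mask for one animal in one frame (uint8, values 0/255)
mask = np.zeros((480, 640), dtype=np.uint8)
cv2.circle(mask, (320, 240), 50, 255, thickness=-1)

# Extract outer contours; each contour is an (N, 1, 2) array of (x, y) vertices
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
polygon = contours[0].squeeze(axis=1)  # (N, 2) polygon, far more compact than the full mask
```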

If you're interested in having some test animal behavior data that already has segmentation predictions, you can look at our open field dataset here.
While we haven't yet released the code for writing this segmentation data publicly, we do have readers for the custom format I designed inside our JABS-behavior-classifier tool.

@niksirbi
Member

Thanks a lot @SkepticRaven, this will be very useful for developing this feature in movement!
