
Mask DINO + Guided Distillation for Semi-Supervised Instance Segmentation

Overview

This repository merges Mask DINO and Guided Distillation for Semi-Supervised Instance Segmentation (originally built on Mask2Former) into a unified codebase. The combination applies Guided Distillation's semi-supervised training recipe to the Mask DINO architecture, targeting stronger object detection and segmentation performance when labeled data is limited.

Projects

  1. Mask DINO: A unified transformer-based framework for object detection and segmentation that achieves state-of-the-art results across instance, panoptic, and semantic segmentation.

  2. Guided Distillation for Semi-Supervised Instance Segmentation: An extension of Mask2Former for semi-supervised instance segmentation that uses guided distillation to improve performance with limited labeled data (a minimal sketch of the core mechanism follows this list).
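
At its core, guided distillation is a teacher-student scheme: the student trains on labeled data plus confident pseudo-labels produced by a teacher, while the teacher tracks an exponential moving average (EMA) of the student's weights. The sketch below illustrates only that generic mechanism on a toy classifier; it is not this repository's API, and the model, momentum, and confidence threshold are placeholder assumptions (the actual code applies the idea to Mask2Former / Mask DINO instance-segmentation heads).

import copy
import torch
import torch.nn.functional as F

student = torch.nn.Linear(16, 4)     # stand-in for the student network
teacher = copy.deepcopy(student)     # teacher starts as a copy of the student
for p in teacher.parameters():
    p.requires_grad_(False)          # the teacher is never updated by SGD
optimizer = torch.optim.SGD(student.parameters(), lr=0.1)

@torch.no_grad()
def ema_update(momentum=0.999):
    # Teacher weights track an exponential moving average of the student's.
    for t, s in zip(teacher.parameters(), student.parameters()):
        t.mul_(momentum).add_(s, alpha=1.0 - momentum)

for step in range(100):
    x_lab = torch.randn(8, 16)       # labeled batch (random stand-in data)
    y_lab = torch.randint(0, 4, (8,))
    x_unlab = torch.randn(32, 16)    # unlabeled batch

    # Supervised loss on labeled data.
    loss = F.cross_entropy(student(x_lab), y_lab)

    # Teacher predicts pseudo-labels on unlabeled data; keep confident ones.
    with torch.no_grad():
        probs = teacher(x_unlab).softmax(dim=-1)
        conf, pseudo = probs.max(dim=-1)
        keep = conf > 0.7            # confidence threshold (assumed value)
    if keep.any():
        loss = loss + F.cross_entropy(student(x_unlab[keep]), pseudo[keep])

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    ema_update()                     # teacher slowly follows the student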

Features

  • Unified architecture for object detection, panoptic, instance, and semantic segmentation.
  • Enhanced performance by combining state-of-the-art techniques from Mask DINO and guided distillation methods.
  • Support for major datasets: COCO, ADE20K, Cityscapes.

Installation

Follow the installation instructions of the upstream Mask DINO and Guided Distillation repositories to set up the environment and dependencies.
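
Both upstream projects are detectron2-based, so setup broadly follows Mask2Former's installation guide. The commands below are a rough sketch only; the Python/package versions and the ops path are assumptions, so defer to the linked instructions.

conda create -n guideddistill python=3.8 -y
conda activate guideddistill
pip install torch torchvision   # choose builds matching your CUDA version
python -m pip install 'git+https://github.com/facebookresearch/detectron2.git'
pip install -r requirements.txt
# Compile the deformable-attention CUDA ops; this path mirrors the upstream
# repositories and may differ in this codebase.
cd mask2former/modeling/pixel_decoder/ops && sh make.sh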

Getting Started

With the environment set up (see Installation above), training is launched through the following entry points; example invocations follow the list.

  1. train_net.py for Mask2Former-based (guided distillation) training.
  2. train_net_org.py and/or train_net1 for Mask DINO training.
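
Assuming the scripts keep the detectron2 launcher convention of the upstream projects, they accept arguments such as --config-file, --num-gpus, --eval-only, and --resume, and a typical invocation looks like the following. The config paths are placeholders; substitute a file from this repository's configs directory.

# Mask2Former-based guided distillation training (config path is illustrative)
python train_net.py --num-gpus 8 --config-file configs/coco/instance-segmentation/maskformer2_R50_bs16_50ep.yaml

# Mask DINO training (config path is illustrative)
python train_net_org.py --num-gpus 8 --config-file configs/coco/instance-segmentation/maskdino_R50_bs16_50ep.yaml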

Results

Mask DINO and Guided Distillation each report state-of-the-art results on their respective benchmarks. Refer to the Mask DINO results and Guided Distillation results for detailed performance metrics.

Citations

If you use this repository in your research, please cite the original papers:

@article{li2022mask,
  title={Mask DINO: Towards A Unified Transformer-based Framework for Object Detection and Segmentation},
  author={Feng Li and Hao Zhang and Huaizhe Xu and Shilong Liu and Lei Zhang and Lionel M. Ni and Heung-Yeung Shum},
  journal={arXiv preprint arXiv:2206.02777},
  year={2022}
}

@article{berrada2023guided,
  title={Guided Distillation for Semi-Supervised Instance Segmentation},
  author={Tariq Berrada and Camille Couprie and Karteek Alahari and Jakob Verbeek},
  journal={arXiv preprint arXiv:2308.02668},
  year={2023}
}

@misc{zhang2022dino,
  title={{DINO}: {DETR} with Improved DeNoising Anchor Boxes for End-to-End Object Detection},
  author={Hao Zhang and Feng Li and Shilong Liu and Lei Zhang and Hang Su and Jun Zhu and Lionel M. Ni and Heung-Yeung Shum},
  year={2022},
  eprint={2203.03605},
  archivePrefix={arXiv},
  primaryClass={cs.CV}
}

@inproceedings{li2022dn,
  title={{DN-DETR}: Accelerate {DETR} Training by Introducing Query Denoising},
  author={Li, Feng and Zhang, Hao and Liu, Shilong and Guo, Jian and Ni, Lionel M and Zhang, Lei},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={13619--13627},
  year={2022}
}

@inproceedings{liu2022dabdetr,
  title={{DAB}-{DETR}: Dynamic Anchor Boxes are Better Queries for {DETR}},
  author={Shilong Liu and Feng Li and Hao Zhang and Xiao Yang and Xianbiao Qi and Hang Su and Jun Zhu and Lei Zhang},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=oMI9PjOb9Jl}
}

Acknowledgement

Many thanks to the excellent open-source projects this codebase builds on, in particular Mask DINO, Mask2Former, and Guided Distillation.
