Integrated-Gradient Optimized Saliency Maps (iGOS++/I-GOS)


iGOS++ and I-GOS are perturbation-based attribution methods that use integrated gradients to optimize a saliency map. For a quick start on the code, please refer to the official GitHub repository.

Intended Use

iGOS++ assumes only black-box access to the model and can generate explanations at arbitrary resolutions: low-resolution masks give a coarse explanation, while high-resolution masks provide a fine-grained explanation over the input space. We believe iGOS++ can provide informative explanations to a wide range of users whenever a black-box classifier is available, including subject-matter experts who want to use XAI to identify interesting features learned by deep networks, and developers who want to debug a black-box model and uncover errors in their data. As an example, we showcase the capabilities of iGOS++ in debugging a COVID-19 classifier on chest X-ray images.
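To make the resolution trade-off concrete, here is a minimal numpy sketch of how a coarse mask can be expanded to input resolution. The function name and the nearest-neighbour scheme are illustrative assumptions; the actual iGOS++ code may use a different interpolation (e.g. bilinear) when upsampling the optimized mask.

```python
import numpy as np

def upsample_mask(mask, factor):
    """Nearest-neighbour upsampling of a coarse saliency mask.

    mask: 2-D array of mask values in [0, 1], e.g. an 8x8 optimized mask.
    factor: integer scale factor to reach the input resolution.
    """
    # np.kron repeats each mask cell as a factor x factor block,
    # so a coarse mask covers the full input as a blocky explanation.
    return np.kron(mask, np.ones((factor, factor)))
```

A low-resolution mask upsampled this way highlights broad regions; optimizing a mask at a higher resolution instead yields a finer-grained explanation at higher computational cost.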


In our experiments, we use commonly-used ImageNet classifiers such as VGG-19 and ResNet-50. For the COVID-19 experiment on chest X-ray images, we use COVID-Net.


Since our method integrates the gradient along the path from the original image to a baseline, we have observed that it does not work well when the model's prediction confidence is very low or when the classifier makes a wrong prediction.
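The path integration above can be sketched in a few lines of numpy. This is a generic integrated-gradients approximation, not the I-GOS implementation (which optimizes a mask rather than attributing a single input); `f_grad`, the baseline, and the step count are illustrative assumptions.

```python
import numpy as np

def integrated_gradients(f_grad, x, baseline, steps=50):
    """Approximate integrated gradients of a scalar model output.

    f_grad: callable returning the gradient of the model output w.r.t. its input.
    x: input array; baseline: reference input (e.g. a blurred or all-zero image).
    """
    # Sample points on the straight-line path from the baseline to the input
    # and average the gradients evaluated at those points.
    alphas = np.linspace(0.0, 1.0, steps)
    avg_grad = np.mean(
        [f_grad(baseline + a * (x - baseline)) for a in alphas], axis=0
    )
    # Scale by the input difference; by the completeness property the
    # attributions approximately sum to f(x) - f(baseline).
    return (x - baseline) * avg_grad
```

When the model's confidence at x is already near the confidence at the baseline (very low confidence or a wrong prediction), the accumulated gradient signal along this path is weak, which matches the limitation noted above.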


Citations

@inproceedings{qi2020visualizing,
  title = {Visualizing Deep Networks by Optimizing with Integrated Gradients},
  author = {Qi, Zhongang and Khorram, Saeed and Fuxin, Li},
  booktitle = {Proceedings of the AAAI Conference on Artificial Intelligence},
  year = {2020}
}

@inproceedings{khorram2021igos,
  title = {iGOS++: Integrated Gradient Optimized Saliency by Bilateral Perturbations},
  author = {Khorram, Saeed and Lawson, Tyler and Fuxin, Li},
  booktitle = {Proceedings of the Conference on Health, Inference, and Learning},
  series = {CHIL '21},
  year = {2021},
  pages = {174--182},
  isbn = {9781450383592},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  doi = {10.1145/3450439.3451865},
  keywords = {COVID-19, chest X-ray, saliency methods, medical imaging, integrated-gradient, explainable AI}
}