Influence Functions for Image Segmentation
Finding Helpful/Harmful training data for PSPNet
Debugging machine learning models remains an important open problem. One of the issues is that black-box models don't return a stack trace pointing at the component that is going wrong. Even the long stack traces of C++ would be preferable to the guesswork of figuring out why a classifier isn't working.
Influence functions were very popular at ICML 2017, where Koh and Liang demonstrated a method for tracing a model's predictions back to the training examples most responsible for them.
First, we need to establish a few basics about how Segmentation models work.
Take image classifiers, for example. For a given problem, you have a set of data and a set of labels. These may be fed through generators, or they may be in-memory arrays. In either case, each instance of the image is represented by a matrix or tensor, and each label by a single integer.
Segmentation adds another level of complexity to this. Instead of just one integer label for each image, we have an integer label for each pixel of the image: a label map with the same spatial dimensions as the input.
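The difference in label shapes can be sketched in numpy. The sizes below are toy-sized for illustration; real PSPNet/ADE20K inputs would be larger crops (commonly 473x473) with 150 classes.

```python
import numpy as np

# Classification: one integer label per image.
# Segmentation: one integer label per pixel.
batch, h, w, n_classes = 2, 8, 8, 150

images = np.zeros((batch, h, w, 3), dtype=np.float32)   # (2, 8, 8, 3)
cls_labels = np.zeros((batch,), dtype=np.int64)         # (2,)  one per image
seg_labels = np.zeros((batch, h, w), dtype=np.int64)    # (2, 8, 8)  one per pixel

# Segmentation losses are usually computed against a per-pixel one-hot encoding:
one_hot = np.eye(n_classes, dtype=np.float32)[seg_labels]
print(one_hot.shape)  # (2, 8, 8, 150)
```

The one-hot map makes explicit why segmentation is heavier than classification: every pixel carries its own classification problem.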
PART 1: Get influence functions to work with PSPNet (image segmentation), non-pixel-wise (use Darkon Influence)
PART 2: Get influence functions to work with PSPNet (image segmentation), pixel-wise (use Darkon Gradients)
PART 3: Compute influence over the entire test image to find which part of it is causing the mischaracterization
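The image-level scores in Part 1 follow Koh and Liang's upweighting influence, I(z, z_test) = -∇L(z_test)ᵀ H⁻¹ ∇L(z). The following is a minimal numpy sketch of that formula on a toy logistic regression, not the darkon/PSPNet pipeline; all names here are illustrative.

```python
import numpy as np

# Toy logistic regression; we then score each training point with
# Koh & Liang's upweighting influence on a test loss. Negative scores
# mark helpful points (removing them would raise the test loss),
# positive scores mark harmful ones.
rng = np.random.default_rng(0)
n, d, lam = 40, 3, 0.1                    # L2 reg keeps the Hessian invertible
X = rng.normal(size=(n, d))
y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

# Fit by plain gradient descent
w = np.zeros(d)
for _ in range(2000):
    w -= 0.5 * (X.T @ (sigmoid(X @ w) - y) / n + lam * w)

def grad_loss(x, yi):                     # per-example loss gradient
    return (sigmoid(x @ w) - yi) * x + lam * w

p = sigmoid(X @ w)
H = (X.T * (p * (1 - p))) @ X / n + lam * np.eye(d)       # Hessian at optimum
H_inv_g_test = np.linalg.solve(H, grad_loss(X[0], y[0]))  # treat point 0 as "test"

scores = np.array([-grad_loss(X[i], y[i]) @ H_inv_g_test for i in range(n)])
helpful = np.argsort(scores)[:5]          # most negative = most helpful
harmful = np.argsort(scores)[-5:][::-1]   # most positive = most harmful
```

Darkon wraps the same computation for TensorFlow models, replacing the explicit Hessian inverse with a stochastic approximation so it scales to networks like PSPNet.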
Influence functions are useful across a wide variety of models, and not just for classification and regression: as we have seen, they extend to more complex tasks such as segmentation. There are still downsides, though. Example-based debugging for machine learning remains a cumbersome, largely manual process.
References
ADE20K Dataset: https://groups.csail.mit.edu/vision/datasets/ADE20K/
Semantic Segmentation (PSPNet) with TensorFlow: https://github.com/hellochick/semantic-segmentation-tensorflow
Darkon Documentation: http://darkon.io/example.html
Darkon Demo: https://darkon-demo.herokuapp.com/influence
Influence function example for Cifar-10, ResNet: https://nbviewer.jupyter.org/github/darkonhub/darkon-examples/blob/master/cifar10-resnet/influencecifar10resnet.ipynb
Mislabel detection using influence function with all layers on Cifar-10, ResNet: https://nbviewer.jupyter.org/github/darkonhub/darkon-examples/blob/master/cifar10-resnet/influencecifar10resnetmislabelall_layers.ipynb
Mislabel detection using influence function with top one layer on Cifar-10, ResNet: https://nbviewer.jupyter.org/github/darkonhub/darkon-examples/blob/master/cifar10-resnet/influencecifar10resnetmislabelone_layer.ipynb
Grad-CAM example for ImageNet, ResNet50 (for the pixel-wise part): https://nbviewer.jupyter.org/github/darkonhub/darkon-examples/blob/master/gradcam/GradcamDemo.ipynb
ICML 2017 Tutorial: https://people.csail.mit.edu/beenkim/papers/BeenKFinaleDVICML2017_tutorial.pdf