You can use the interpret-community SDK to explain models locally, without an Azure workspace. For notebooks demonstrating the local experience, see: https://github.com/interpretml/interpret-community/tree/master/notebooks
Follow these sample notebooks to learn about the model interpretability integration with Azure:
- Explain on remote AMLCompute: explain a model trained on a remote AMLCompute target.
- Explain tabular data with Run History: explain a model on tabular data and track the explanation with Run History.
- Operationalize model explanation: deploy a model explanation as a web service.