
A Multimodal Automated Interpretability Agent

Understanding an AI system can take many forms. For instance, we might want to know when and how the system relies on sensitive or spurious features, identify systematic errors in its predictions, or learn how to modify the training data and model architecture to improve accuracy and robustness. Today, answering these types of questions often requires significant effort from researchers, who must synthesize the outcomes of different experiments conducted with a variety of tools.