causal-relations-between-representations

@inproceedings{
  wang2022ncinet,
  title={Do learned representations respect causal relationships?},
  author={Lan Wang and Vishnu Naresh Boddeti},
  booktitle={Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2022}
}

Overview


NCINet is an approach for observational causal discovery from high-dimensional data. It is trained purely on synthetically generated representations and can then be applied to real representations. It can also be used to analyze how various design choices in representation learning affect the underlying causal relations between learned representations.
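Because training uses only synthetic data, each training pair comes with a causal direction that is known by construction. The snippet below is a minimal sketch of how such cause-effect representation pairs might be generated; the function, dimensions, and noise level are assumptions for illustration, not the repository's actual generator.

```python
import numpy as np

def synthetic_pair(dim=64, n=128, rng=None):
    """Generate one (A, B) representation pair with a known causal label.

    Illustrative sketch only: the actual synthetic generator in this
    repository may use different causal functions and dimensions.
    """
    rng = rng or np.random.default_rng()
    X = rng.normal(size=(n, dim))                   # cause representation
    W = rng.normal(size=(dim, dim)) / np.sqrt(dim)  # random mixing matrix
    noise = 0.1 * rng.normal(size=(n, dim))
    Y = np.tanh(X @ W) + noise                      # effect = f(cause) + noise
    if rng.random() < 0.5:
        return X, Y, 0   # label 0: first element causes the second
    return Y, X, 1       # label 1: pair presented in the reverse order
```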

Dataset


We annotate each face image in CASIA-Webface with multi-label attributes: color of hair, visibility of eyes, type of eyewear, facial hair, whether the mouth is open, smiling or not, wearing a hat, visibility of forehead, and gender.
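For instance, the annotation of a single image can be viewed as an attribute-to-label mapping that is flattened into a fixed-order vector. The attribute names and encoding below are assumptions for illustration only, not the repository's actual annotation format.

```python
# Hypothetical encoding of the per-image annotations; the real annotation
# files and attribute names in this repository may differ.
ATTRIBUTES = [
    "hair_color", "eye_visibility", "eyewear_type", "facial_hair",
    "mouth_open", "smiling", "hat", "forehead_visibility", "gender",
]

def encode(annotation: dict) -> list:
    """Turn one image's annotation dict (attribute -> class id) into a fixed-order vector."""
    return [annotation[a] for a in ATTRIBUTES]

# Example: one annotated face image with all attributes set to class 0.
example = {a: 0 for a in ATTRIBUTES}
print(encode(example))
```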

How to evaluate NCINet (3Dshape)

Causal consistency on 6 causal pair graphs.

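One plausible way to compute such a consistency score is the fraction of representation pairs whose predicted causal direction matches a reference direction. The sketch below assumes per-pair predictions and reference labels are already available; it is not necessarily how the paper's metric over the 6 causal pair graphs is computed.

```python
import numpy as np

def causal_consistency(predicted, reference):
    """Fraction of pairs whose predicted causal direction matches the reference.

    Simplified, assumed definition for illustration; the repository's
    evaluation over the 6 causal pair graphs may differ.
    """
    predicted = np.asarray(predicted)
    reference = np.asarray(reference)
    return float((predicted == reference).mean())
```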

How to evaluate NCINet (CASIA-Webface)

Causal consistency on 6 causal pair graphs.


Neural Causal Inference Net (Generalization Experiment on Synthetic Dataset)

Training and testing the model

  1. cd into ./NCINet and set the causal function index (--idx) and the adversarial setting (--w), following the example in run.sh (see the argument-parsing sketch after this list):

     python main.py --args args/NN.txt --idx=0 --w=1
    
  2. Run run.sh

  3. For other parameters and settings, check args/NN.txt and config.py.

  4. For visualization, run:

     tensorboard --logdir=runs
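
The flags used above (--args, --idx, --w) are consumed by main.py. The snippet below is a minimal sketch of how such parsing might look; the actual argument names, defaults, and semantics live in main.py, config.py, and args/NN.txt, so treat it as illustrative only.

```python
import argparse

# Illustrative only: the real argument definitions are in main.py / config.py.
parser = argparse.ArgumentParser(description="NCINet generalization experiment (sketch)")
parser.add_argument("--args", type=str, default="args/NN.txt",
                    help="path to a text file with additional arguments")
parser.add_argument("--idx", type=int, default=0,
                    help="index of the synthetic causal function to train/test on")
parser.add_argument("--w", type=float, default=1.0,
                    help="adversarial setting (assumed to weight or toggle the adversarial term)")
opts = parser.parse_args()
print(opts)
```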
    
