Correcting the cost function formula for Counterfactual examples (Dandl et al.'s method) (S#9.3.1.2) #364

Open · wants to merge 2 commits into master
8 changes: 4 additions & 4 deletions manuscript/06.1-example-based-counterfactual.Rmd
@@ -177,13 +177,13 @@ Let us now have a look on another approach overcoming these issues.

Dandl et al. suggest to simultaneously minimize a four-objective loss:

-$$L(x,x',y',X^{obs})=\big(o_1(\hat{f}(x'),y'),o_2(x, x'),o_3(x,x'),o_4(x',X^{obs})\big) $$
+$$L(x,x',y',X^{obs})=\big(o_1(\hat{f}(x'),Y'),o_2(x, x'),o_3(x,x'),o_4(x',X^{obs})\big) $$

-Each of the four objectives $o_1$ to $o_4$ corresponds to one of the four criteria mentioned above.
+where $Y'$ is a set of desired outcomes. Each of the four objectives $o_1$ to $o_4$ corresponds to one of the four criteria mentioned above.
The first objective $o_1$ reflects that the prediction of our counterfactual x' should be as close as possible to our desired prediction y'.
-We therefore want to minimize the distance between $\hat{f}(x')$ and y', here calculated by the Manhattan metric ($L_1$ norm):
+We therefore want to minimize the distance between $\hat{f}(x')$ and a set of desired outcomes Y', here calculated by the Manhattan metric ($L_1$ norm):

-$$o_1(\hat{f}(x'),y')=\begin{cases}0&\text{if $\hat{f}(x')\in{}y'$}\\\inf\limits_{y'\in y'}|\hat{f}(x')-y'|&\text{else}\end{cases}$$
+$$o_1(\hat{f}(x'),Y')=\begin{cases}0&\text{if $\hat{f}(x')\in{}Y'$}\\\inf\limits_{y'\in Y'}|\hat{f}(x')-y'|&\text{else}\end{cases}$$

The second objective $o_2$ reflects that our counterfactual should be as similar as possible to our instance $x$.
It quantifies the distance between x' and x
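
To make the proposed correction concrete, here is a minimal Python sketch (not part of the pull request or the book) of the corrected first objective $o_1$. It assumes $Y'$ is given either as a finite set of desired values or as an interval encoded as a `(low, high)` tuple; the function name `o1` and that interval convention are illustrative assumptions, not anything defined by Dandl et al.

```python
def o1(f_xprime, Y_prime):
    """Sketch of the first objective: distance of the prediction f(x')
    to the set of desired outcomes Y', which is 0 if the prediction
    already lies in Y' and the infimum of |f(x') - y'| over Y' otherwise.

    Y_prime may be a (low, high) tuple, read as an interval, or any
    finite collection of target values (assumed conventions).
    """
    if isinstance(Y_prime, tuple) and len(Y_prime) == 2:
        low, high = Y_prime
        if low <= f_xprime <= high:
            return 0.0  # prediction is inside the desired interval
        # infimum over an interval is the distance to the nearer endpoint
        return min(abs(f_xprime - low), abs(f_xprime - high))
    # finite set of desired outcomes
    if f_xprime in Y_prime:
        return 0.0
    return min(abs(f_xprime - y) for y in Y_prime)


# Example: desired prediction between 0.8 and 1.0
print(o1(0.65, (0.8, 1.0)))  # ~0.15, distance to the lower endpoint
print(o1(0.9, (0.8, 1.0)))   # 0.0, prediction already in Y'
```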