
Commit e6ef8af

LICENSE update, metric losses update
1 parent 7c16e95 commit e6ef8af

6 files changed, +18 -1 lines changed


LICENSE

+1 -1

@@ -1,6 +1,6 @@
 MIT License
 
-Copyright (c) 2019 Chris Choy ([email protected])
+Copyright (c) 2019 Chris Choy ([email protected]), Jaesik Park ([email protected])
 
 Permission is hereby granted, free of charge, to any person obtaining a copy of
 this software and associated documentation files (the "Software"), to deal in

README.md

+17

@@ -19,6 +19,23 @@ CGF by Khoury et al. maps 3D oriented histograms to a low-dimensional feature sp

Our work addressed a number of limitations in the prior work. First, all prior approaches extract a small 3D patch or a set of points and map it to a low-dimensional space. This not only limits the receptive field of the network but is also computationally inefficient since all intermediate representations are computed separately even for overlapping 3D regions. Second, using expensive low-level geometric signatures as input can slow down feature computation. Lastly, limiting feature extraction to a subset of interest points results in lower spatial resolution for subsequent matching stages and can thus reduce registration accuracy.

### Fully Convolutional Metric Learning and Hardest Contrastive, Hardest Triplet Loss

Traditional metric learning assumes that features are independent and identically distributed (i.i.d.) since a batch is constructed by random sampling. In fully-convolutional feature extraction, however, adjacent features are locally correlated, so hard-negative mining can return features adjacent to the anchor that are in fact false negatives. Filtering out these false negatives is therefore crucial, similar to how the Universal Correspondence Network by Choy et al. uses a distance threshold.
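
For illustration, below is a minimal PyTorch sketch of such a distance-threshold filter. The function name, tensor layout, and threshold value are illustrative assumptions, not the exact implementation in this repository.

```python
import torch

def mine_hardest_negatives(anchor_feat, cand_feat, anchor_xyz, cand_xyz, dist_thresh=0.1):
    """For each anchor, pick the hardest negative in feature space while
    ignoring candidates that lie within `dist_thresh` of the anchor in 3D
    space, since such neighbors are likely false negatives.
    Coordinates are assumed to be expressed in a common (aligned) frame."""
    # Pairwise feature-space distances: (N_anchor, N_cand)
    feat_dist = torch.cdist(anchor_feat, cand_feat)
    # Pairwise 3D distances, used only to mask out spatially adjacent points
    xyz_dist = torch.cdist(anchor_xyz, cand_xyz)
    # Exclude likely false negatives by pushing their feature distance to infinity
    feat_dist[xyz_dist < dist_thresh] = float('inf')
    # Hardest (closest in feature space) remaining negative per anchor
    hardest_dist, hardest_idx = feat_dist.min(dim=1)
    return hardest_dist, hardest_idx
```
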
Also, the number of features used in the fully-convolutional setting is orders of magnitude larger than in standard metric learning algorithms. For instance, FCGF generates ~40k features for a pair of scans (and this grows proportionally with the batch size), while a minibatch in traditional metric learning holds around 1k features; 40k features give on the order of 10^9 pairwise distances, versus roughly 10^6 for a 1k-feature minibatch. Thus, it is not feasible to use all pairwise distances within a batch as in standard metric learning.

To speed up fully-convolutional feature learning, we propose the hardest contrastive loss and the hardest triplet loss. Visually, these are simple variants that mine the hardest negative for each of the two points within a positive pair.

| Contrastive Loss | Triplet Loss | Hardest Contrastive | Hardest Triplet |
|:------------------:|:------------------:|:-------------------:|:------------------:|
| ![1](images/1.png) | ![2](images/2.png) | ![3](images/3.png) | ![4](images/4.png) |

*Sampling and negative-mining strategy for each method. Blue: positives; red: negatives. Traditional contrastive and triplet losses use random sampling. Our hardest-contrastive and hardest-triplet losses use the hardest negatives.*
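
As a rough illustration, here is a minimal PyTorch sketch of the hardest-contrastive idea: a contrastive loss in which the negative for each member of a positive pair is its hardest mined negative. Margins, candidate subsampling, and the false-negative filtering described above are simplified, and the names are illustrative rather than the repository's actual API.

```python
import torch

def hardest_contrastive_loss(feat0, feat1, pos_pairs, margin_pos=0.1, margin_neg=1.4):
    """Pull positive pairs together and push BOTH members of each pair away
    from their hardest mined negatives. `pos_pairs` is an (N, 2) long tensor
    of index pairs (i into feat0, j into feat1)."""
    f0 = feat0[pos_pairs[:, 0]]  # anchor features from scan 0
    f1 = feat1[pos_pairs[:, 1]]  # corresponding features from scan 1

    # Positive term: penalize positive pairs farther apart than margin_pos
    pos_dist = (f0 - f1).norm(dim=1)
    pos_loss = torch.clamp(pos_dist - margin_pos, min=0).pow(2)

    # Hardest-negative mining for both members of each positive pair.
    # (A complete implementation would also mask out false negatives by 3D
    # distance, as sketched above, and subsample candidates for speed.)
    d0 = torch.cdist(f0, feat1)  # anchors in scan 0 vs. all features in scan 1
    d1 = torch.cdist(f1, feat0)  # anchors in scan 1 vs. all features in scan 0
    d0.scatter_(1, pos_pairs[:, 1:2], float('inf'))  # never mine the true match
    d1.scatter_(1, pos_pairs[:, 0:1], float('inf'))
    neg0 = d0.min(dim=1).values
    neg1 = d1.min(dim=1).values
    neg_loss = 0.5 * (torch.clamp(margin_neg - neg0, min=0).pow(2)
                      + torch.clamp(margin_neg - neg1, min=0).pow(2))

    return (pos_loss + neg_loss).mean()
```
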
Please refer to our [ICCV'19 paper](https://node1.chrischoy.org/data/publications/fcgf/fcgf.pdf) for more details.

### Visualization of FCGF

We color-coded FCGF features for pairs of 3D scans that are 10m apart for KITTI and a 3DMatch benchmark pair for indoor scans. FCGF features are mapped to a scalar space using t-SNE and colorized with the Spectral color map.
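
Below is a rough sketch of how such a coloring can be produced with scikit-learn and Matplotlib. The input files `feats.npy` and `xyz.npy` are hypothetical placeholders for one scan's FCGF features and point coordinates; the repository's own visualization scripts may differ.

```python
import numpy as np
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

# Hypothetical inputs: (N, D) FCGF features and (N, 3) point coordinates for one scan.
feats = np.load('feats.npy')
xyz = np.load('xyz.npy')

# Embed the D-dimensional features into a single scalar per point with t-SNE,
# normalize to [0, 1], and look up per-point colors from the Spectral colormap.
scalar = TSNE(n_components=1).fit_transform(feats).ravel()
scalar = (scalar - scalar.min()) / (scalar.max() - scalar.min() + 1e-12)
colors = plt.get_cmap('Spectral')(scalar)[:, :3]

# The colored point cloud can then be rendered, e.g. with Open3D:
#   import open3d as o3d
#   pcd = o3d.geometry.PointCloud()
#   pcd.points = o3d.utility.Vector3dVector(xyz)
#   pcd.colors = o3d.utility.Vector3dVector(colors)
#   o3d.visualization.draw_geometries([pcd])
```
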

images/1.png (12.2 KB)
images/2.png (18.9 KB)
images/3.png (13 KB)
images/4.png (46.8 KB)

0 commit comments
