Commit
Merge pull request #25 from meyerls/dev
Dev
meyerls authored Apr 12, 2024
2 parents 37a723f + 838bd4d commit 665dcac
Showing 14 changed files with 204 additions and 124 deletions.
17 changes: 6 additions & 11 deletions .github/workflows/python-package.yml
@@ -8,7 +8,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
- python-version: [ "3.7", "3.8", "3.9", "3.10" ]
+ python-version: ["3.8", "3.9", "3.10" ]

steps:
- uses: actions/checkout@v3
@@ -22,15 +22,10 @@ jobs:
pip install pip==21.3.1
pip install numpy --use-deprecated=legacy-resolver
pip install charset_normalizer
- git clone https://github.com/meyerls/mistree.git
- cd mistree
- pip install -e .
- cd ..
pip install -r requirements.txt
pip install .
- #- name: 'Test PC Skeletor'
- # run: |
- # pytest .
- # ls -la
- # ls -la ${{ github.workspace }}
- # pwd
+ - name: 'Test PC Skeletor'
+ run: |
+ pwd
+ ls -la
+ pytest tests
4 changes: 2 additions & 2 deletions .github/workflows/python-publish.yml
@@ -8,10 +8,10 @@ jobs:
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@master
- - name: Set up Python 3.7
+ - name: Set up Python 3.8
uses: actions/setup-python@v1
with:
- python-version: 3.7
+ python-version: 3.8
- name: Install pypa/build
run: python -m pip install build --user
- name: Build a binary wheel and a source tarball
8 changes: 6 additions & 2 deletions .gitignore
@@ -11,9 +11,13 @@ pc_skeletor.egg-info
pc_skeletor/__pycache__/**
pc_skeletor/tmp*

# Output
pc_skeletor/output*
output

__pycache__
test/__pycache__

test/__pycache__/**
__pycache__/**
__pycache__/**

annotated_tree.py
3 changes: 3 additions & 0 deletions .gitmodules
@@ -0,0 +1,3 @@
[submodule "mistree"]
path = third_party/mistree
url = https://github.com/meyerls/mistree.git
18 changes: 10 additions & 8 deletions Readme.md
@@ -5,17 +5,17 @@
<a href="https://github.com/meyerls/pc-skeletor/blob/main/LICENSE"><img alt="license" src="https://img.shields.io/github/license/meyerls/pc-skeletor"></a>
<!--a href="https://github.com/meyerls/pc-skeletor/actions"><img alt="GitHub Workflow Status" src="https://img.shields.io/github/workflow/status/meyerls/pc-skeletor/Python%20package"></a-->

- **PC Skeletor** is a Python library for extracting a 1d skeleton from 3d point clouds using
+ **PC Skeletor** is a Python library for extracting a curved skeleton from 3d point clouds using
[Laplacian-Based Contraction](https://taiya.github.io/pubs/cao2010cloudcontr.pdf) and
[Semantic Laplacian-Based Contraction](https://arxiv.org/abs/2304.04708).

## Abstract
- Standard Laplacian-based contraction (LBC) is prone to mal-contraction in cases where
+ Basic Laplacian-based contraction (LBC) is prone to mal-contraction in cases where
there is a significant disparity in diameter between trunk and branches. In such cases fine structures experience
over-contraction, leading to a distortion of their topological characteristics. In addition, LBC shows a
topologically incorrect tree skeleton for trunk structures that have holes in the point cloud. In order to address
these topological artifacts, we introduce semantic Laplacian-based contraction (S-LBC). It integrates semantic
- information of the point cloud into the contraction algorithm.
+ information of the point cloud into the contraction algorithm to overcome these artifacts.


<table>
@@ -35,7 +35,7 @@ information of the point cloud into the contraction algorithm.

### Installation

- First install [Python](https://www.python.org/downloads/) Version 3.7 or higher. The python package can be installed
+ First install [Python](https://www.python.org/downloads/) Version 3.8 or higher. The python package can be installed
via [PyPi](https://pypi.org/project/pc-skeletor/) using pip.

````sh
@@ -83,10 +83,10 @@ lbc = LBC(point_cloud=pcd,
down_sample=0.008)
lbc.extract_skeleton()
lbc.extract_topology()

# Debug/Visualization
lbc.visualize()
lbc.show_graph(lbc.skeleton_graph)
lbc.show_graph(lbc.topology_graph)
- lbc.save('./output')
+ lbc.export_results('./output')
lbc.animate(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]),
steps=300,
output='./output')
@@ -103,10 +103,12 @@ s_lbc = SLBC(point_cloud={'trunk': pcd_trunk, 'branches': pcd_branch},
debug=True)
s_lbc.extract_skeleton()
s_lbc.extract_topology()

# Debug/Visualization
s_lbc.visualize()
s_lbc.show_graph(s_lbc.skeleton_graph)
s_lbc.show_graph(s_lbc.topology_graph)
- s_lbc.save('./output')
+ s_lbc.export_results('./output')
s_lbc.animate(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=300, output='./output')
````
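The `init_rot` argument threaded through the `animate` calls above is a plain 3x3 matrix applied to the cloud before rendering. The particular matrix used here swaps the y- and z-coordinates (presumably to switch between z-up and y-up conventions; the repo does not say). A quick numpy check:

```python
import numpy as np

# The matrix passed as init_rot in the snippets above.
init_rot = np.asarray([[1, 0, 0],
                       [0, 0, 1],
                       [0, 1, 0]])

p = np.array([1.0, 2.0, 3.0])
# Left-multiplying swaps the y and z coordinates of a point;
# applying the matrix twice gives back the identity.
print(init_rot @ p)  # [1. 3. 2.]
```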

16 changes: 8 additions & 8 deletions example_tree.py
@@ -22,9 +22,9 @@
lbc.visualize()
lbc.show_graph(lbc.skeleton_graph, fig_size=(30, 30))
lbc.show_graph(lbc.topology_graph)
- lbc.save('./output')
- # lbc.animate(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=500, output='./output')
- # lbc.animate_contracted_pcd(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=300, output='./output')
+ lbc.export_results('./output')
+ lbc.animate(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=500, output='./output_lbc')
+ #lbc.animate_contracted_pcd(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=300, output='./output')
lbc.animate_topology(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=300, output='./output')

# Semantic Laplacian-based Contraction
@@ -36,9 +36,9 @@
s_lbc.extract_skeleton()
s_lbc.extract_topology()
s_lbc.visualize()
- s_lbc.show_graph(lbc.skeleton_graph, fig_size=(30, 30))
- s_lbc.show_graph(lbc.topology_graph)
- s_lbc.save('./output')
- #s_lbc.animate(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=500, output='./output')
+ s_lbc.show_graph(s_lbc.skeleton_graph, fig_size=(30, 30))
+ s_lbc.show_graph(s_lbc.topology_graph)
+ s_lbc.export_results('./output_slbc')
+ s_lbc.animate(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=500, output='./output')
#s_lbc.animate_contracted_pcd(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=300, output='./output')
- s_lbc.animate_topology(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=300, output='./output')
+ s_lbc.animate_topology(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=300, output='./output_slbc')
16 changes: 15 additions & 1 deletion pc_skeletor/base.py
@@ -41,12 +41,26 @@ def __init__(self, verbose: bool = False, debug: bool = False):
self.topology_graph: nx.Graph = nx.Graph()

def extract_skeleton(self):
'''
Extract skeleton from point cloud
:return:
'''
pass

def extract_topology(self):
'''
Extract topology from point cloud
:return:
'''
pass

- def save(self, *args):
+ def process(self):
+     self.extract_skeleton()
+     self.extract_topology()

+ def export_results(self, *args):
+     pass

def show_graph(self, graph: networkx.Graph, pos: Union[np.ndarray, bool] = True, fig_size: tuple = (20, 20)):
94 changes: 63 additions & 31 deletions pc_skeletor/laplacian.py
@@ -13,6 +13,7 @@
import open3d.visualization as o3d
import robust_laplacian
import mistree as mist
import networkx as nx

# Own modules
from pc_skeletor.download import *
@@ -29,8 +30,8 @@ def __init__(self,
algo_type: str,
point_cloud: Union[str, open3d.geometry.PointCloud, dict],
init_contraction: int,
- init_attraction: int,
- max_contraction: int,
+ init_attraction: float,
+ max_contraction: float,
max_attraction: int,
step_wise_contraction_amplification: Union[float, str],
termination_ratio: float,
@@ -58,16 +59,16 @@

# Set or load point cloud to apply algorithm
if isinstance(point_cloud, str):
- self.pcd: o3d.geometry.PointCloud = load_pcd(filename=point_cloud, normalize=False)
+ self.pcd: o3d.geometry.PointCloud = load_pcd(filename=point_cloud)
elif isinstance(point_cloud, dict):
# Currently only two classes are supported!
if isinstance(point_cloud['trunk'], str):
- self.trunk: o3d.geometry.PointCloud = load_pcd(filename=point_cloud['trunk'], normalize=False)
+ self.trunk: o3d.geometry.PointCloud = load_pcd(filename=point_cloud['trunk'])
else:
self.trunk: o3d.geometry.PointCloud = point_cloud['trunk']

if isinstance(point_cloud['branches'], str):
- self.branches: o3d.geometry.PointCloud = load_pcd(filename=point_cloud['branches'], normalize=False)
+ self.branches: o3d.geometry.PointCloud = load_pcd(filename=point_cloud['branches'])
else:
self.branches: o3d.geometry.PointCloud = point_cloud['branches']
elif isinstance(point_cloud, open3d.geometry.PointCloud):
@@ -181,7 +182,7 @@ def extract_skeleton(self):
pcd_points_current = pcd_points
while np.mean(M_list[-1]) / np.mean(M_list[0]) > self.param_termination_ratio:
pbar.set_description(
- "Current volume ratio {}. Contraction weights {}. Attraction weights {}. Progress {}".format(
+ "Volume ratio: {}. Contraction weights: {}. Attraction weights: {}. Progress {}".format(
volume_ratio, np.mean(laplacian_weights), np.mean(positional_weights), self.algo_type))
logging.debug('Laplacian Weight: {}'.format(laplacian_weights))
logging.debug('Mean Positional Weight: {}'.format(np.mean(positional_weights)))
@@ -294,22 +295,22 @@ def extract_topology(self):

return self.topology
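For intuition, `extract_topology` reduces a graph over the contracted points to a tree via a minimum spanning tree (the repo delegates this to mistree). The following scipy sketch illustrates the same idea; it is an assumption-laden stand-in, not the package's actual code:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial import cKDTree

def topology_edges(points: np.ndarray, k: int = 6) -> np.ndarray:
    """Connect each skeleton point to its k nearest neighbours and
    keep only the minimum spanning tree of that distance graph."""
    tree = cKDTree(points)
    dist, idx = tree.query(points, k=k + 1)  # first hit is the point itself
    n = len(points)
    rows = np.repeat(np.arange(n), k)
    cols = idx[:, 1:].ravel()
    graph = csr_matrix((dist[:, 1:].ravel(), (rows, cols)), shape=(n, n))
    mst = minimum_spanning_tree(graph).tocoo()
    return np.column_stack([mst.row, mst.col])  # one row per MST edge
```

A spanning tree of n connected points always has n - 1 edges, which is what makes the result a clean skeleton topology.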

- def save(self, output: str):
+ def export_results(self, output: str):
os.makedirs(output, exist_ok=True)
path_contracted_pcd = os.path.join(output, '01_point_cloud_contracted_{}'.format(self.algo_type) + '.ply')
o3d.io.write_point_cloud(path_contracted_pcd, self.contracted_point_cloud)

path_skeleton = os.path.join(output, '02_skeleton_{}'.format(self.algo_type) + '.ply')
- o3d.io.write_point_cloud(path_skeleton, self.skeleton)
+ o3d.io.write_point_cloud(filename=path_skeleton, pointcloud=self.skeleton)

path_topology = os.path.join(output, '03_topology_{}'.format(self.algo_type) + '.ply')
- o3d.io.write_line_set(path_topology, self.topology)
+ o3d.io.write_line_set(filename=path_topology, line_set=self.topology)

path_skeleton_graph = os.path.join(output, '04_skeleton_graph_{}'.format(self.algo_type) + '.gpickle')
- nx.write_gpickle(self.skeleton_graph, path_skeleton_graph)
+ nx.write_gpickle(G=self.skeleton_graph, path=path_skeleton_graph)

path_topology_graph = os.path.join(output, '05_topology_graph_{}'.format(self.algo_type) + '.gpickle')
- nx.write_gpickle(self.skeleton_graph, path_topology_graph)
+ nx.write_gpickle(G=self.topology_graph, path=path_topology_graph)
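`nx.write_gpickle` in networkx 2.x (the version pinned in requirements.txt) is a thin wrapper around the `pickle` module, so the exported `.gpickle` files can be inspected without Open3D or a re-run of the pipeline. `load_exported_graph` below is an illustrative helper resting on that assumption; it is not part of the package:

```python
import pickle

def load_exported_graph(path: str):
    # nx.write_gpickle in networkx 2.x serialises with pickle, so the
    # *.gpickle files written by export_results can be read back directly.
    with open(path, 'rb') as f:
        return pickle.load(f)

# e.g. load_exported_graph('./output/04_skeleton_graph_LBC.gpickle'),
# following the path pattern used by export_results above.
```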


class LBC(LaplacianBasedContractionBase):
@@ -363,6 +364,18 @@ def __init__(self,
o3d.visualization.draw_geometries([pcd], window_name="Default Point Cloud")

def __least_squares_sparse(self, pcd_points, L, laplacian_weighting, positional_weighting):
"""
Perform least squares sparse solving for the Laplacian-based contraction.
Args:
pcd_points: The input point cloud points.
L: The Laplacian matrix.
laplacian_weighting: The Laplacian weighting matrix.
positional_weighting: The positional weighting matrix.
Returns:
The contracted point cloud.
"""
# Define Weights
WL = sparse.diags(laplacian_weighting) # I * laplacian_weighting
WH = sparse.diags(positional_weighting)
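For orientation, the weights `WL` and `WH` feed a sparse least-squares solve that stacks the Laplacian (contraction) constraint on top of the positional (attraction) constraint. The sketch below follows the LBC formulation; the per-coordinate `lsqr` solve is an assumption about the elided remainder of the function, not a copy of it:

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import lsqr

def contract_once(pcd_points, L, laplacian_weighting, positional_weighting):
    """One contraction step: solve [WL @ L; WH] v' = [0; WH @ v] in
    the least-squares sense, one spatial coordinate at a time."""
    WL = sparse.diags(laplacian_weighting)
    WH = sparse.diags(positional_weighting)
    A = sparse.vstack([WL @ L, WH]).tocsc()
    b = np.vstack([np.zeros_like(pcd_points), WH @ pcd_points])
    contracted = np.zeros_like(pcd_points)
    for dim in range(3):  # x, y, z are independent least-squares problems
        contracted[:, dim] = lsqr(A, b[:, dim])[0]
    return contracted
```

With large positional weights the solution stays near the input cloud; raising the Laplacian weights pulls it toward the skeleton.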
@@ -413,12 +426,12 @@ class SLBC(LaplacianBasedContractionBase):
Our semantic skeletonization algorithm based on Laplacian-Based Contraction.
- Paper: tbd
+ Paper: https://arxiv.org/abs/2304.04708
"""

def __init__(self,
- point_cloud: Union[str, open3d.geometry.PointCloud],
+ point_cloud: Union[str, dict],
semantic_weighting: float = 10.,
init_contraction: float = 1.,
init_attraction: float = 0.5,
@@ -463,6 +476,18 @@ def __init__(self,
o3d.visualization.draw_geometries([self.pcd], window_name="Default Point Cloud")

def __least_squares_sparse(self, pcd_points, L, laplacian_weighting, positional_weighting):
"""
Perform least squares sparse solving for the Semantic Laplacian-Based Contraction (S-LBC).
Args:
pcd_points: The input point cloud points.
L: The Laplacian matrix.
laplacian_weighting: The Laplacian weighting matrix.
positional_weighting: The positional weighting matrix.
Returns:
The contracted point cloud.
"""
# Define Weights
WL = sparse.diags(laplacian_weighting) # I * laplacian_weighting
WH = sparse.diags(positional_weighting)
@@ -483,6 +508,7 @@ def __least_squares_sparse(self, pcd_points, L, laplacian_weighting, positional_weighting):
num_valid = np.arange(0, pcd_points.shape[0])[mask]
S[rows, cols] = 1

# ToDo: Speed up!
for i in num_valid:
S[i, L[i].nonzero()[1]] = multiplier
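The loop under `# ToDo: Speed up!` writes `multiplier` into `S` at every nonzero column of `L` for each semantically masked row. Assuming that is all it does, the same matrix can be built without a Python-level loop via a diagonal row-scaling (an illustrative sketch, not the repo's code):

```python
import numpy as np
from scipy import sparse

def amplify_rows(L, mask, multiplier):
    """Return S with S[i, j] = multiplier wherever L[i, j] != 0 and mask[i]
    is True, and 1 on the remaining nonzeros of L, with no per-row loop."""
    pattern = (L != 0).astype(float)          # sparsity pattern of L
    scale = np.where(mask, multiplier, 1.0)   # per-row amplification
    return sparse.diags(scale) @ pattern      # scale each row at once
```

The diagonal multiply touches only stored entries, so this stays O(nnz) instead of looping over rows in Python.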

@@ -531,22 +557,28 @@ def __least_squares_sparse(self, pcd_points, L, laplacian_weighting, positional_weighting):
pcd_branch = o3d.io.read_point_cloud(branch_pcd_path)
pcd = pcd_trunk + pcd_branch

- # pcd = o3d.io.read_point_cloud("/home/luigi/Documents/reco/23_04_14/02/tree.ply")
- # Laplacian-based Contraction
- lbc = LBC(point_cloud=pcd, down_sample=0.01)
- lbc.extract_skeleton()
- lbc.extract_topology()
- lbc.show_graph(lbc.skeleton_graph)
- lbc.show_graph(lbc.topology_graph)
- lbc.visualize()
- lbc.save('./output')
- lbc.animate(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=300, output='./output')
-
- # Semantic Laplacian-based Contraction
- s_lbc = SLBC(point_cloud={'trunk': pcd_trunk, 'branches': pcd_branch}, semantic_weighting=10, down_sample=0.01)
- s_lbc.extract_skeleton()
- s_lbc.extract_topology()
- s_lbc.show_graph(s_lbc.skeleton_graph)
- s_lbc.show_graph(s_lbc.topology_graph)
- s_lbc.visualize()
- s_lbc.save('./output')
- s_lbc.animate(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=300, output='./output')
+ if False:
+     lbc = LBC(point_cloud=pcd, init_contraction=3.,
+               init_attraction=0.6,
+               max_contraction=2048,
+               max_attraction=1024,
+               down_sample=0.02)
+     lbc.extract_skeleton()
+     lbc.extract_topology()
+     lbc.show_graph(lbc.skeleton_graph)
+     lbc.show_graph(lbc.topology_graph)
+     lbc.visualize()
+     lbc.export_results('./output')
+     # lbc.animate(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=300, output='./output_1')
+ else:
+     # Semantic Laplacian-based Contraction
+     s_lbc = SLBC(point_cloud={'trunk': pcd_trunk, 'branches': pcd_branch}, semantic_weighting=10, down_sample=0.009)
+     s_lbc.extract_skeleton()
+     s_lbc.extract_topology()
+     s_lbc.show_graph(s_lbc.skeleton_graph)
+     s_lbc.show_graph(s_lbc.topology_graph)
+     s_lbc.visualize()
+     s_lbc.export_results('./output')
+     s_lbc.animate(init_rot=np.asarray([[1, 0, 0], [0, 0, 1], [0, 1, 0]]), steps=300, output='./output')
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,3 +1,3 @@
[build-system]
requires = ["setuptools>=42"]
- build-backend = "setuptools.build_meta"
\ No newline at end of file
+ build-backend = "setuptools.build_meta"
11 changes: 6 additions & 5 deletions requirements.txt
@@ -1,12 +1,13 @@
numpy
- robust_laplacian
+ robust_laplacian==0.2.4
scipy
matplotlib
- open3d
+ open3d==0.16
tqdm
imageio
- networkx
+ networkx==2.6.3
charset-normalizer
- mistree==1.2.1
+ #mistree==1.2.1
pytest
- sniffio
\ No newline at end of file
+ sniffio
+ mistree @ git+https://github.com/meyerls/mistree.git