
Standardize geometric data to increase robustness of polyskeleton. #38

Open
saeranv opened this issue Jun 9, 2020 · 0 comments
Labels: enhancement (New feature or request)


saeranv commented Jun 9, 2020

Currently, the straight skeleton generated by the polyskel function fails beyond a certain level of geometric complexity. Most of these failures seem to be related to how point-equivalence tolerances shift as the geometry changes size, which may interact poorly with the spatial-hashing tolerances used in the graph algorithms.

A possible solution is to standardize and center the coordinate data; an initial test of this approach resolved all of the previously failing tests. Since point equivalence depends on floating-point precision, a robust approach would be to normalize all coordinates to fall between 0 and some fixed number, and then move the "origin" to the (1, 1) coordinate. Choosing what number to scale the geometry to then becomes a question of identifying the scale that solves the most geometries; based on initial tests, a number between 1 and 10 seems best.
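A minimal sketch of what this normalization could look like, assuming plain (x, y) tuples rather than the library's own point types; the function name `normalize_polygon` and its `scale` parameter are hypothetical illustrations, not existing polyskel API:

```python
def normalize_polygon(points, scale=10.0):
    """Rescale 2D points so the bounding box's longest side equals
    `scale`, then shift the minimum corner to (1, 1).

    points: list of (x, y) tuples.
    scale: target size for the bounding box's longest dimension.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    min_x, min_y = min(xs), min(ys)
    # The longest bounding-box dimension sets the scale factor.
    extent = max(max(xs) - min_x, max(ys) - min_y)
    factor = scale / extent if extent > 0 else 1.0
    # Scale into [0, scale], then translate the "origin" to (1, 1).
    return [((x - min_x) * factor + 1.0,
             (y - min_y) * factor + 1.0) for x, y in points]


# Example: a 1000 x 500 footprint is mapped into a 10-unit box.
poly = [(0.0, 0.0), (1000.0, 0.0), (1000.0, 500.0), (0.0, 500.0)]
print(normalize_polygon(poly))
# [(1.0, 1.0), (11.0, 1.0), (11.0, 6.0), (1.0, 6.0)]
```

Presumably the skeleton computed on the normalized points would then be mapped back to the original coordinates with the inverse transform before being returned.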

saeranv added the enhancement label on Jun 9, 2020