I'm using Python 3 on a Mac with 16 GB of unified memory and a 1 TB SSD. I'm diffing two different files, each about 750 KB. Running diff_main pegs the CPU at up to 98.6% for about 20 seconds. Is this normal, or does it need to be optimized?
@oliu this is likely down to the documents themselves; performance is heavily dependent on the content being diffed.
if you have optimizations you'd like to propose, please don't hesitate to share them. this library already performs a handful of optimizations that trade off the "minimum edit script" between two input documents in order to avoid catastrophic runtime, and it also includes cleanup passes to make the final edit script easier for a human to read (e.g. preferring whole-word diffs over zig-zagging character diffs within a word).
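For reference, here is a minimal sketch of two common ways to keep large diffs cheap, assuming the standard diff-match-patch Python port (the `Diff_Timeout` attribute and the `diff_linesToChars` / `diff_charsToLines` helpers are assumptions if your build differs; the file names are placeholders):

```python
from diff_match_patch import diff_match_patch

# Hypothetical inputs: two ~750 KB text files.
text1 = open("a.txt", encoding="utf-8").read()
text2 = open("b.txt", encoding="utf-8").read()

dmp = diff_match_patch()

# Option 1: cap the time spent searching for a minimal diff.
# When the deadline is hit, the library falls back to a coarser
# (but still valid) edit script. Setting 0 would disable the cap.
dmp.Diff_Timeout = 5.0  # seconds
diffs = dmp.diff_main(text1, text2)
dmp.diff_cleanupSemantic(diffs)  # make the result more human-readable

# Option 2: line-mode diff. Hash each line to a single character,
# diff the much shorter "character" strings, then re-expand the
# result back into lines. Much faster on large line-oriented files,
# at the cost of line-level rather than character-level granularity.
chars1, chars2, line_array = dmp.diff_linesToChars(text1, text2)
line_diffs = dmp.diff_main(chars1, chars2, False)
dmp.diff_charsToLines(line_diffs, line_array)
```

In practice the line-mode pass tends to help most on large, mostly line-oriented inputs; whether the coarser granularity is acceptable depends on your use case.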