This repository has been archived by the owner on Apr 2, 2018. It is now read-only.

Setting timer interval to 100ms causes high CPU usage #32

Open
gyk opened this issue Aug 23, 2017 · 2 comments

Comments

@gyk

gyk commented Aug 23, 2017

When calling timer.interval(Duration::from_millis(100)), the program consumes almost 100% of a CPU core, but after changing the interval from 100 to 101 ms, CPU usage drops dramatically to nearly 0%. Tested on macOS 10.12 with Rust 1.17.

Here is my code; you can reproduce the problem with cargo run --release 1000 100 versus cargo run --release 1000 101.

@carllerche
Member

Sounds like a bug!

@carllerche
Member

This may be related to: #11


2 participants