Is there a way to define the gaze prediction rate, or at least define a maximum?
Context:
I am currently setting up an online experiment with WebGazer running in the background; its prediction data is periodically appended to an experiment data .csv file, which is saved once the experiment finishes.
The catch is that I am running into some memory issues (my task lasts about 45 minutes in total), with WebGazer and its predictions gradually becoming slower and laggier as the task progresses.
To reduce the memory load, I figured one option could be to set the prediction rate lower than the out-of-the-box default, which gives more precision than I actually need.
I have tried to find information about this in your documentation but have not found anything; apologies if I missed something.
Questions:
Is there a prediction rate parameter somewhere that I missed?
If not, could you point me to where I should make the edit in the WebGazer.js script, or better yet, is there a solution that could be implemented easily?
Also, if you have any other tips to reduce the memory load, please let me know; that would be super useful!
Thanks a lot for your help and for making this tool available to us.
Tristan A. White
PhD candidate
Lyon, France
Hmm, it seems the bigger problem here is the memory. I thought we had fixed the memory leak that appears when WebGazer runs for longer than 30 minutes, but maybe there's another one. It might be a good idea for someone to investigate.
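In the meantime, a possible application-side workaround is to throttle how many predictions you actually keep, rather than changing WebGazer's internal prediction rate. This is a minimal sketch, assuming no built-in rate parameter exists: a generic throttling helper drops samples in your gaze listener so that at most one prediction per interval is stored. The `makeThrottledRecorder` name and the usage with `webgazer.setGazeListener` are illustrative, not part of the WebGazer API.

```javascript
// Generic throttle: returns a function that forwards a sample to `record`
// at most once every `intervalMs` milliseconds, dropping the rest.
function makeThrottledRecorder(record, intervalMs) {
  let lastKept = -Infinity; // timestamp (ms) of the last sample we kept
  return function (sample, timestampMs) {
    if (timestampMs - lastKept >= intervalMs) {
      lastKept = timestampMs;
      record(sample, timestampMs);
    }
  };
}

// Hypothetical usage with WebGazer's gaze listener (sketch, untested):
// const rows = [];
// const keep = makeThrottledRecorder(
//   (data, t) => rows.push({ x: data.x, y: data.y, t }),
//   100 // keep roughly 10 samples per second
// );
// webgazer.setGazeListener((data, elapsedTime) => {
//   if (data) keep(data, elapsedTime);
// }).begin();
```

Note this only shrinks the data you accumulate (and the CSV you build); WebGazer still runs its prediction loop at full speed, so it will not by itself fix a leak inside the library.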