Hi everyone,
I am currently working on a camera-focusing project for microscopes in the area of pathology, and I am trying to simulate the focusing behaviour in Webots. I use a camera object (FoV: 2.4, near: 0.01) with a Focus node (focalLength: 0.001, focalDistance: 0.05, maxFocalDistance: 1, minFocalDistance: 0.01) that looks at a Display object which is fixed in position and shows tissue samples.
In my simulation the camera moves towards and away from the display in fixed steps of 0.001, between 0.029 (minimum distance to the display) and 0.032 (maximum distance to the display). At each step the camera takes a picture and saves it. The optimal focus point is at a distance of 0.0305.
Within a certain range of distances the defocused images look as I would expect (roughly between 0.0296 and 0.0315). However, when I move the camera further away, the images show an unusually high contrast, especially around the edges of the objects (in my case the tissue samples). Can anyone explain this behaviour of the camera focus? Thanks a lot in advance!
Image at the optimal focus point, at a distance of 0.0305 from the display:
Image with the expected defocus behaviour, at a distance of 0.0297:
Image with the unexpected defocus behaviour, at a distance of 0.0291 (high contrast around edges):
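For what it's worth, here is a rough thin-lens circle-of-confusion sketch of the setup, using the Focus node parameters from the post. This is not Webots' actual blur implementation, and the aperture value is a made-up placeholder; it only illustrates what an ideal thin-lens model would predict for the sweep:

```python
def coc_diameter(d, focal_length=0.001, focal_distance=0.05, aperture=0.0005):
    """Thin-lens blur-circle diameter for a subject at distance d.

    focal_length and focal_distance come from the Focus node in the post;
    aperture is a hypothetical placeholder (Webots does not expose one here).
    c = A * f * |d - s| / (d * (s - f))
    """
    return (aperture * focal_length * abs(d - focal_distance)
            / (d * (focal_distance - focal_length)))

# Blur predicted at the three distances mentioned in the post:
for d in (0.0291, 0.0305, 0.0315):
    print(f"d = {d}: blur circle ~ {coc_diameter(d):.3e} m")
```

Interestingly, with focalDistance set to 0.05 this simple model predicts that blur decreases monotonically across the whole 0.029-0.032 sweep (sharpest focus would lie at d = 0.05, outside the sweep), rather than a best-focus point at 0.0305 - so the observed behaviour presumably comes from how the Focus node's blur is actually implemented, which might also be related to the edge artefacts.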