
Gesture.Tap() behaving weirdly, one finger needs to be on screen to receive onStart and onEnd events #3048

Open
ashkalor opened this issue Aug 14, 2024 · 9 comments
Labels
Platform: Android · Platform: iOS · Repro provided

Comments

@ashkalor

ashkalor commented Aug 14, 2024

Description

Hey RNGH developers,

I have been using React Native Gesture Handler for quite some time now and it's doing an amazing job. However, recently I have been experimenting with React Three Fiber in an Expo React Native project.

For the tap gesture, only the onBegin callback is triggered, and no matter how many times I tap, I only receive onBegin events. After further experimentation, I realized that for the onStart and onEnd callbacks to be fired, one finger needs to be placed on the screen before performing the normal tap motion. This triggers the onStart and onEnd callbacks.

Below is my code snippet and a snack link for experimentation.

import React from "react";
import { View } from "react-native";
import { Canvas } from "@react-three/fiber/native";
import { Gesture, GestureDetector } from "react-native-gesture-handler";

export default function TapExample() {
  const tap = Gesture.Tap()
    .onBegin(() => {
      console.log("Begin");
    })
    .onStart((event) => {
      console.log("start");
    })
    .onEnd((event) => {
      console.log("end");
    });
  return (
    <GestureDetector gesture={tap}>
      <View style={{ flex: 1 }}>
        <Canvas style={{ flex: 1 }}>
          <ambientLight intensity={0.1} />
          <directionalLight color="red" position={[0, 0, 5]} />
          <mesh
            onClick={() => {
              console.log("mesh click");
            }}
          >
            <boxGeometry />
            <meshStandardMaterial />
          </mesh>
        </Canvas>
      </View>
    </GestureDetector>
  );
}

I'm hoping to get this solved. I believe that supporting React Native Gesture Handler together with React Three Fiber will really help in the coming days: support for react-native-wgpu has been announced, three.js already has working WebGPU examples, and R3F is the only library that provides both an internal and an external interaction system in 3D space for React Native.

Warning

Trying to run the latest versions of React Three Fiber will throw the error raised in issue . I would recommend using a real device so that you can dismiss these errors from the error popup screen.

Steps to reproduce

  1. Set up an Expo React Native project with React Three Fiber and React Native Gesture Handler.
  2. Copy the above code into your index file after wrapping your layout with GestureHandlerRootView (a sketch follows this list).
  3. Tap the screen; only the onBegin callbacks fire.
  4. Place one finger on the screen and tap with another to get the onStart and onEnd callbacks (which is not the expected behaviour).
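For step 2, a minimal root-layout sketch (purely illustrative; the App.js entry point and the TapExample file name are my assumptions, not part of the original repro):

// Minimal sketch of step 2: the whole app is wrapped in
// GestureHandlerRootView so RNGH can take part in touch handling.
import React from "react";
import { GestureHandlerRootView } from "react-native-gesture-handler";
import TapExample from "./TapExample"; // hypothetical file containing the snippet above

export default function App() {
  return (
    <GestureHandlerRootView style={{ flex: 1 }}>
      <TapExample />
    </GestureHandlerRootView>
  );
}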

Snack or a link to a repository

https://snack.expo.dev/@ashkalor/authentic-blue-salsa

Gesture Handler version

2.18.1

React Native version

0.74.5

Platforms

Android, iOS

JavaScript runtime

None

Workflow

Expo bare workflow

Architecture

Paper (Old Architecture)

Build type

Debug mode

Device

Real device

Device model

Samsung Galaxy Tab A9

Acknowledgements

Yes

@github-actions github-actions bot added the Platform: Android, Platform: iOS, and Repro provided labels on Aug 14, 2024
@ashkalor ashkalor changed the title Gesture.Tap() behaving weirdly, one finger needs to be on screen to recieve start and end events Gesture.Tap() behaving weirdly, one finger needs to be on screen to receive onStart and onEnd events Aug 14, 2024
@ashkalor
Author

Hey guys,
Any update on this?
I'm really stuck here and don't know where to even start debugging.

@NikitaDudin

Try replacing the first child of GestureDetector with an Animated.View from the react-native-reanimated library; it should be a Reanimated node. https://docs.swmansion.com/react-native-gesture-handler/docs/gestures/gesture-detector/#reference
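A minimal sketch of that suggestion (assuming react-native-reanimated is installed; the component name is illustrative and the scene inside Canvas stays the same as in the original snippet):

import React from "react";
import Animated from "react-native-reanimated";
import { Canvas } from "@react-three/fiber/native";
import { Gesture, GestureDetector } from "react-native-gesture-handler";

export default function TapWithReanimatedChild() {
  const tap = Gesture.Tap()
    .onBegin(() => console.log("Begin"))
    .onStart(() => console.log("start"))
    .onEnd(() => console.log("end"));

  return (
    <GestureDetector gesture={tap}>
      {/* Reanimated node as the direct child of GestureDetector */}
      <Animated.View style={{ flex: 1 }}>
        <Canvas style={{ flex: 1 }}>
          <ambientLight intensity={0.1} />
          <directionalLight color="red" position={[0, 0, 5]} />
          <mesh>
            <boxGeometry />
            <meshStandardMaterial />
          </mesh>
        </Canvas>
      </Animated.View>
    </GestureDetector>
  );
}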

@leaf541

leaf541 commented Sep 18, 2024

I'm facing the same issue; are there any possible fixes? I already have the first child as a Reanimated Animated.View.

@TweetyBoop1990

Has anyone figured out a fix for this? I have pretty much the same issue. Everything was working fine on RN 0.73.6, then I upgraded to 0.76.2 and now my gestures require a finger to be on the screen for anything to activate (on Android only). What could be causing this?!

@j-piasecki
Member

Hey, sorry for the delay - we were looking into this issue but no update was posted here 😅.

The problem here is twofold. First, threejs/native uses the JS responder system to handle touches, and RNGH has an either/or relationship with it: when the JS responder is handling touches, Gesture Handler isn't, and vice versa. Which one gets priority depends on which one activates first. In this specific case, the JS responder activates first, so the tap gesture is always cancelled.

The other part is that RNGH respects the platform's native touch handling. On Android, this comes down to handling requestDisallowInterceptTouchEvent, which may also be triggered by the JS responder, and the only information it carries is that the view should stop intercepting touch events; it doesn't say what triggered it.

We don't really have a reliable way to detect that these two are connected, so we respect both and cancel the gestures if they aren't active yet, to avoid breaking the touch handling of the JS responder and of native Android views.
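To make the first part concrete, the same either/or behaviour can be sketched without threejs at all, using React Native's Gesture Responder System props on a plain View (a minimal, illustrative example; whether this exact snippet cancels the tap may depend on versions, but it mirrors how r3f/native claims touches):

import React from "react";
import { View } from "react-native";
import { Gesture, GestureDetector } from "react-native-gesture-handler";

export default function ResponderConflictSketch() {
  const tap = Gesture.Tap()
    .onStart(() => console.log("tap start"))
    .onEnd(() => console.log("tap end"));

  return (
    <GestureDetector gesture={tap}>
      {/* The inner View claims the JS responder as soon as a touch starts; */}
      {/* per the explanation above, the tap gesture is then expected to be */}
      {/* cancelled before it can activate. */}
      <View
        style={{ flex: 1 }}
        onStartShouldSetResponder={() => true}
        onResponderGrant={() => console.log("JS responder grabbed the touch")}
      />
    </GestureDetector>
  );
}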

This is related to a broader category of issues in RNGH which we are still investigating and looking for a good solution to. The bad news is that the fix will likely be a breaking change and require the new architecture.

If you need a workaround for it, you can try disabling the mechanism I mentioned earlier by removing these lines:

if (_enabled) {
  rootHelper!!.requestDisallowInterceptTouchEvent()
}

if (blockNativeResponder) {
  UiThreadUtil.runOnUiThread { tryCancelAllHandlers() }
}

But keep in mind that it can lead to problems in other places where Gesture Handler is used. If you find any issues while using this workaround, I would appreciate it if you could describe them here, as it may help us handle this in a better way.

@ashkalor
Author

ashkalor commented Nov 22, 2024

So based on my understanding, correct me if I am wrong: instead of the workaround you mentioned, if we port r3f/native to support RNGH instead of the JS responder, then we should be able to use all of RNGH's features with it, right?
The event system for native in React Three Fiber is, in a way, emulated from the JS responder system.
Am I thinking in the right direction, or is there a better way to go about this?

Will this also allow us to nest different gestures inside the canvas?

@j-piasecki
Member

I believe that's correct, though it would be great to test whether disabling the responder system makes gestures work to make sure we didn't miss anything.

@ashkalor
Author

I can definitely help, but I haven't really worked directly with Kotlin or Java much, so I am a bit skeptical about putting my hands into that.
Based on what you mentioned:
do you want me to try disabling RNGH for the above example and just test with the JS responder?

@j-piasecki
Member

I meant to check whether removing the onPress implementation from r3f makes RNGH work (this should be entirely in JS, though I'm unfamiliar with the codebase). Sorry for not being clear 😅.
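In terms of the original snippet, the simplest first check (my assumption of what this would mean at the app level; it may well be that the Canvas installs its responder regardless of whether any pointer handlers are passed, which is exactly what this would reveal) is:

// In the original snippet, drop the r3f pointer handler from the mesh:
<mesh>            {/* no onClick here */}
  <boxGeometry />
  <meshStandardMaterial />
</mesh>
// Then tap the screen. If onStart/onEnd now fire, the conflict comes from
// r3f's pointer handling; if they still don't, the responder is attached by
// the Canvas itself and would have to be disabled inside r3f's code.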
