Hand tracking on HoloLens 2 is awesome, but detecting gestures is not. Currently, HoloLens supports the following built-in hand gestures:

HoloLens 1

https://www.researchgate.net/publication/329148998/figure/fig2/AS:696116504973318@1542978516912/The-common-air-tap-gesture-used-in-the-HoloLens-application.png

HoloLens 2

https://docs.microsoft.com/en-us/hololens/images/hololens-2-start-alternative.png


Hand Tracking

If no motion controllers are present, the user's hands act as the default controllers.

// Requires: using Microsoft.MixedReality.Toolkit.Input;
//           using Microsoft.MixedReality.Toolkit.Utilities;
MixedRealityPose pose;
if (HandJointUtils.TryGetJointPose(TrackedHandJoint.IndexTip, Handedness.Right, out pose))
{
    // pose.Position and pose.Rotation give the position and orientation of the right index tip
}
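
The same joint API can also approximate gestures that are not exposed directly, by comparing joint positions each frame. Below is a minimal sketch: the helper name IsRightIndexPinching and the 0.03 m threshold are my own choices, not part of MRTK, and the threshold would need tuning per application.

using UnityEngine;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;

public static class HandGestureUtils
{
    // Hypothetical helper: returns true when the right thumb tip and index tip
    // are close enough together to count as a pinch.
    public static bool IsRightIndexPinching(float threshold = 0.03f)
    {
        if (HandJointUtils.TryGetJointPose(TrackedHandJoint.ThumbTip, Handedness.Right, out MixedRealityPose thumbTip) &&
            HandJointUtils.TryGetJointPose(TrackedHandJoint.IndexTip, Handedness.Right, out MixedRealityPose indexTip))
        {
            return Vector3.Distance(thumbTip.Position, indexTip.Position) < threshold;
        }

        // One or both joints are not currently tracked.
        return false;
    }
}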

Gesture Detection

HoloLens uses *Pointers* to determine which GameObject is in focus. To detect a gesture such as air tap, a class can implement IMixedRealityPointerHandler and register itself with the input system, as shown below.

https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/Images/Pointers/MRTK_Pointer_Main.png

using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;

// Singleton<T> is assumed to be a MonoBehaviour-based singleton helper defined in the project.
public class FiringProjectiles : Singleton<FiringProjectiles>, IMixedRealityPointerHandler
{
    void Awake()
    {
        // Register this object as a global listener for pointer events
        CoreServices.InputSystem?.RegisterHandler<IMixedRealityPointerHandler>(this);
    }

    void IMixedRealityPointerHandler.OnPointerUp(MixedRealityPointerEventData eventData)
    {
        // Empty implementation required by the interface
    }

    void IMixedRealityPointerHandler.OnPointerDown(MixedRealityPointerEventData eventData)
    {
        // Empty implementation required by the interface
    }

    void IMixedRealityPointerHandler.OnPointerDragged(MixedRealityPointerEventData eventData)
    {
        // Empty implementation required by the interface
    }

    // Detecting the air tap gesture
    void IMixedRealityPointerHandler.OnPointerClicked(MixedRealityPointerEventData eventData)
    {
        if (eventData.InputSource.SourceName == "Right Hand" ||
            eventData.InputSource.SourceName == "Mixed Reality Controller Right")
        {
            // Do something when the user does an air tap using their right hand only
        }
    }
}
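
Since RegisterHandler makes this class a global listener, it is worth unregistering it when the object goes away so the input system does not keep a stale reference. A minimal sketch, assuming the same FiringProjectiles class:

// Unregister the global pointer handler when this object is destroyed
void OnDestroy()
{
    CoreServices.InputSystem?.UnregisterHandler<IMixedRealityPointerHandler>(this);
}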