Has anyone figured out a way to reliably locate and interact with the AssistiveTouch button on iOS?

I have some tests that require a shake gesture, and tend to favor physical device execution when I can.

I have code in place to leverage the accessibility shortcut, which lets me triple-click the side/power button to toggle AssistiveTouch on and off at will.

However, I can’t find any way to actually locate the AssistiveTouch button on screen; it doesn’t show up in the page source. As a workaround, I moved the button to the bottom-right corner on all test devices. Annoying, but it technically works.

I’d love to find a more reliable solution than assuming it’s in the corner and tapping blindly. It must expose some accessibility info, since turning on VoiceOver lets the user “find” the AssistiveTouch button.

Edit: Xcode’s Accessibility Inspector can also find the button, and it seems to have plenty of info, including an identifier of ‘AssistiveTouchNubbit’, but for some reason Appium can’t see it.

Thanks.

It might be that this button belongs to a different app. You can switch between active apps using the defaultActiveApplication/activeAppDetectionPoint settings.


Way late, but I finally had time to revisit this, and you were spot on.

The trick was to change defaultActiveApplication to com.apple.assistivetouchd using the Settings API. After that, the driver was able to find ‘AssistiveTouchNubbit’.
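In case it helps anyone, here’s a minimal sketch of that step using the Appium Python client. The driver session itself is a placeholder (capabilities omitted); the settings payload is the only part the thread confirms, so I split it into a small helper:

```python
# Sketch: point the XCUITest driver's active-application detection at
# the AssistiveTouch daemon so its elements show up in the page source.
# Assumes an already-running Appium session; driver setup is not shown.

ASSISTIVE_TOUCH_BUNDLE = "com.apple.assistivetouchd"


def build_active_app_settings(bundle_id: str = ASSISTIVE_TOUCH_BUNDLE) -> dict:
    """Payload for the XCUITest driver's Settings API."""
    return {"defaultActiveApplication": bundle_id}


# With a live session (hypothetical usage):
#   driver.update_settings(build_active_app_settings())
#   nubbit = driver.find_element(AppiumBy.ACCESSIBILITY_ID, "AssistiveTouchNubbit")
```

Remember to set defaultActiveApplication back to "auto" (or your app’s bundle id) afterwards, or subsequent page-source queries will keep targeting the daemon.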

Note that I was also able to instantly query the status of AssistiveTouch with ‘mobile: queryAppState’ and {“bundleId”:“com.apple.assistivetouchd”}.
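A sketch of that status check. The state codes follow the XCUITest driver’s queryAppState convention (0 not installed, 1 not running, 2 suspended in background, 3 running in background, 4 running in foreground); treating 4 as “AssistiveTouch is on” is my assumption — verify what the daemon actually reports on your device:

```python
# Sketch: ask the XCUITest driver for the AssistiveTouch daemon's state.
# Assumes an active Appium session object is passed in as `driver`.

RUNNING_IN_FOREGROUND = 4  # assumed "on" state; verify on a real device


def is_assistive_touch_on(driver) -> bool:
    """Return True if assistivetouchd reports the foreground-running state."""
    state = driver.execute_script(
        "mobile: queryAppState", {"bundleId": "com.apple.assistivetouchd"}
    )
    return state == RUNNING_IN_FOREGROUND
```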

Combined with mobile: performIoHidEvent to enable/disable AssistiveTouch, we can now automate a real-device shake gesture reasonably well, with little cost in performance or flakiness.
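For completeness, a sketch of the toggle side of that, assuming the accessibility shortcut is bound to AssistiveTouch so a triple-press of the side button flips it. mobile: performIoHidEvent takes a HID usage page, usage, and press duration; the Consumer page (0x0C) with usage 0x30 for the side/power button is my assumption here — check the XCUITest driver docs and your device before relying on it:

```python
# Sketch: toggle AssistiveTouch by simulating the accessibility shortcut
# (triple-press of the side/power button) via mobile: performIoHidEvent.
# HID codes are assumptions: Consumer page 0x0C, usage 0x30 (power).

SIDE_BUTTON_EVENT = {"page": 0x0C, "usage": 0x30, "durationSeconds": 0.1}


def toggle_assistive_touch(driver, presses: int = 3) -> None:
    """Send `presses` side-button presses to trigger the shortcut."""
    for _ in range(presses):
        driver.execute_script("mobile: performIoHidEvent", SIDE_BUTTON_EVENT)
```

After toggling it on and switching defaultActiveApplication to the daemon, the ‘AssistiveTouchNubbit’ element should be tappable like any other, so the shake entry in the AssistiveTouch menu can be reached without blind corner taps.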