Android Performance

Case Study: System-Wide Lag Caused by Android Accessibility Services

Word count: 1.6k · Reading time: 10 min
2019/01/21

The Phenomenon

Users reported that scrolling through lists, whether on the home screen or in settings, occasionally felt jittery or “shaky.” This issue wasn’t universal but affected a subset of users consistently once it appeared.

For those in a hurry, you can skip to the “The Culprits” and “Self-Check” sections at the bottom. For those curious about the investigative process, let’s dive into the trace.

Systrace Analysis

We successfully reproduced the issue on a local test device. Analyzing the Systrace revealed that even simple flings on the home screen (Launcher) triggered significant frame drops.

The red arrows indicate dropped frames. From the buffer count, we can see that SurfaceFlinger (SF) isn’t rendering because the Launcher process isn’t submitting any buffers.

Looking at the Launcher’s trace, we can see that the lack of rendering is due to missing input events. Without input events, the Launcher doesn’t know it needs to update its state or render a new frame.

Missing input events during a continuous swipe is a major red flag: a moving finger should report touch points continuously. Before suspecting faulty touchscreen hardware, we decided to first inspect the InputReader and InputDispatcher threads in system_server.

The trace shows InputReader is working fine, but InputDispatcher is behaving abnormally. In a healthy system, their cycles should look like this:

In the problematic trace, InputDispatcher has changed its rhythm to match the Vsync signal. Instead of being woken up by InputReader as soon as a point is read, it is woken up by the system_server UI thread on each Vsync pulse.

The next step was to dig into the code to find out why InputReader failed to wake up InputDispatcher.

Code Analysis

The logic for InputReader to wake InputDispatcher (for a Motion move event) is in InputDispatcher.cpp:

void InputDispatcher::notifyMotion(const NotifyMotionArgs* args) {

    if (!validateMotionEvent(args->action, args->actionButton,
            args->pointerCount, args->pointerProperties)) {
        return;
    }

    uint32_t policyFlags = args->policyFlags;
    policyFlags |= POLICY_FLAG_TRUSTED;

    android::base::Timer t;
    mPolicy->interceptMotionBeforeQueueing(args->eventTime, /*byref*/ policyFlags);
    if (t.duration() > SLOW_INTERCEPTION_THRESHOLD) {
        ALOGW("Excessive delay in interceptMotionBeforeQueueing; took %s ms",
                std::to_string(t.duration().count()).c_str());
    }

    bool needWake; // whether the dispatcher thread needs to be woken up
    { // acquire lock
        mLock.lock();

        if (shouldSendMotionToInputFilterLocked(args)) {
            mLock.unlock();

            MotionEvent event;
            event.initialize(args->deviceId, args->source, args->action, args->actionButton,
                    args->flags, args->edgeFlags, args->metaState, args->buttonState,
                    0, 0, args->xPrecision, args->yPrecision,
                    args->downTime, args->eventTime,
                    args->pointerCount, args->pointerProperties, args->pointerCoords);

            policyFlags |= POLICY_FLAG_FILTERED;
            // The upper SystemServer layer gets to filter the event first
            if (!mPolicy->filterInputEvent(&event, policyFlags)) {
                return; // event was consumed by the filter
            }

            mLock.lock();
        }

        // Just enqueue a new motion event.
        MotionEntry* newEntry = new MotionEntry(args->eventTime,
                args->deviceId, args->source, policyFlags,
                args->action, args->actionButton, args->flags,
                args->metaState, args->buttonState,
                args->edgeFlags, args->xPrecision, args->yPrecision, args->downTime,
                args->displayId,
                args->pointerCount, args->pointerProperties, args->pointerCoords, 0, 0);

        needWake = enqueueInboundEventLocked(newEntry);
        mLock.unlock();
    } // release lock

    if (needWake) {
        mLooper->wake();
    }
}

The key part is the filterInputEvent method. If it returns false, the entire function returns early, meaning InputDispatcher never gets woken up. Let’s trace this call:

JNI Layer (com_android_server_input_InputManagerService.cpp):

bool NativeInputManager::filterInputEvent(const InputEvent* inputEvent, uint32_t policyFlags) {
    jobject inputEventObj;
    JNIEnv* env = jniEnv();

    switch (inputEvent->getType()) {
    case AINPUT_EVENT_TYPE_KEY:
        inputEventObj = android_view_KeyEvent_fromNative(env,
                static_cast<const KeyEvent*>(inputEvent));
        break;
    case AINPUT_EVENT_TYPE_MOTION:
        inputEventObj = android_view_MotionEvent_obtainAsCopy(env,
                static_cast<const MotionEvent*>(inputEvent));
        break;
    default:
        return true;
    }

    bool result = env->CallBooleanMethod(mServiceObj,
            gServiceClassInfo.filterInputEvent, inputEventObj, policyFlags);

    env->DeleteLocalRef(inputEventObj);
    return result;
}

Java Layer (InputManagerService.java):

public boolean filterInputEvent(InputEvent event, int policyFlags) {
    synchronized (mInputFilterLock) {
        if (mInputFilter != null) {
            try {
                return mInputFilter.filterInputEvent(event, policyFlags);
            } catch (RemoteException e) {
                /* ignore */
            }
        }
        return true;
    }
}

InputFilter.java:

public boolean filterInputEvent(InputEvent event, int policyFlags) {
    synchronized (mLock) {
        if (mHost == null) {
            return true;
        }
        try {
            return mHost.filterInputEvent(event, policyFlags);
        } catch (RemoteException e) {
            /* ignore */
        }
        return true;
    }
}

AccessibilityInputFilter.java:

public boolean onInputEvent(InputEvent event, int policyFlags) {
    if (mEventHandler == null) {
        return true;
    }

    // Verify the event stream is consistent.
    final int deviceId = event.getDeviceId();
    if (mCurrentDeviceId >= 0 && mCurrentDeviceId != deviceId) {
        resetStream();
    }
    mCurrentDeviceId = deviceId;

    // Check if we have any features enabled.
    final int featureMask = getEnabledFeatures();
    if (featureMask == 0) {
        return true;
    }

    // Handle the event.
    switch (event.getType()) {
        case InputEvent.TYPE_MOTION: {
            MotionEvent motionEvent = (MotionEvent) event;
            return processMotionEvent(motionEvent, policyFlags, featureMask);
        }
        case InputEvent.TYPE_KEY: {
            KeyEvent keyEvent = (KeyEvent) event;
            return processKeyEvent(keyEvent, policyFlags, featureMask);
        }
        default: {
            return true;
        }
    }
}

The processMotionEvent method is where things get interesting:

private boolean processMotionEvent(MotionEvent event, int policyFlags, int featureMask) {
    // Only process scroll events.
    if (event.getActionMasked() != MotionEvent.ACTION_SCROLL) {
        return true;
    }

    // Batch the event for later processing.
    batchMotionEvent(event, policyFlags, featureMask);
    return false; // Event consumed by the filter
}

And the batchMotionEvent method:

private void batchMotionEvent(MotionEvent event, int policyFlags, int featureMask) {
    // Add the event to the batch.
    if (mMotionEvent == null) {
        mMotionEvent = MotionEvent.obtain(event);
    } else {
        mMotionEvent.addBatch(event);
    }
    mPolicyFlags = policyFlags;
    mFeatureMask = featureMask;

    // Schedule processing of batched events.
    scheduleProcessBatchedEvents();
}

Finally, the critical scheduleProcessBatchedEvents method:

private void scheduleProcessBatchedEvents() {
    if (!mProcessBatchedEventsScheduled) {
        mProcessBatchedEventsScheduled = true;
        mChoreographer.postCallback(Choreographer.CALLBACK_INPUT,
                mProcessBatchedEventsRunnable, null);
    }
}

The mProcessBatchedEventsRunnable is where the actual processing happens:

private final Runnable mProcessBatchedEventsRunnable = new Runnable() {
    @Override
    public void run() {
        mProcessBatchedEventsScheduled = false;
        processBatchedEvents();
    }
};

The Root Cause

The issue is clear now: Accessibility services that monitor “Execute Gestures” cause all scroll events to be batched and processed on the next Vsync signal.

When an accessibility service enables the FEATURE_TOUCH_EXPLORATION or similar features that require monitoring gestures, the AccessibilityInputFilter intercepts all scroll events. Instead of passing them immediately to InputDispatcher, it batches them and schedules processing on the next Vsync pulse via mChoreographer.postCallback().

This adds a delay of up to one frame (16.6ms at 60Hz) between when the touchscreen reports a point and when the application receives it. During continuous scrolling, move events arrive late or bunched onto frame boundaries, so some frames render with stale input or are dropped outright, which produces the jittery, “shaky” sensation users reported.
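
If you want to observe this extra delivery latency from the app side, one rough approach is to compare each move event’s hardware timestamp with the time the app actually handles it. The sketch below is illustrative only (the view subclass and the logging threshold are mine, not part of the framework code above); note that apps already align input handling to the frame clock in ViewRootImpl, so absolute numbers need care, and the useful signal is the difference between runs with the suspect accessibility service enabled and disabled.

import android.content.Context;
import android.os.SystemClock;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;

// Rough probe: log how long each ACTION_MOVE event waited before reaching the app.
// MotionEvent.getEventTime() and SystemClock.uptimeMillis() share the same time base,
// so their difference approximates the delivery latency added in the input pipeline.
public class LatencyProbeView extends View {

    public LatencyProbeView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getActionMasked() == MotionEvent.ACTION_MOVE) {
            long latencyMs = SystemClock.uptimeMillis() - event.getEventTime();
            if (latencyMs > 8) { // illustrative threshold, roughly half a 60Hz frame
                Log.w("LatencyProbe", "MOVE delivered " + latencyMs + " ms after the touch sample");
            }
        }
        return super.onTouchEvent(event);
    }
}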

The Culprits

Any app that requests accessibility permissions and enables gesture monitoring can cause this issue. Common culprits include:

  1. “Cleaner” or “Battery Saver” apps that claim to optimize performance but actually degrade it
  2. “Gesture Navigation” enhancers that add custom swipe gestures
  3. “Screen Recording” or “Automation” tools that need to intercept touch events
  4. “Anti-Touch” or “Pocket Mode” apps that prevent accidental touches

Self-Check

If you’re experiencing similar jank issues, here’s how to diagnose them:

  1. Check installed accessibility services (a programmatic version of this check is sketched after this list):

    • Go to Settings → Accessibility → Installed services
    • Disable all services and test scrolling performance
    • Re-enable services one by one to identify the culprit
  2. Use Systrace to verify:

    • Capture a trace during problematic scrolling
    • Look for InputDispatcher being woken by Vsync instead of InputReader
    • Check if AccessibilityInputFilter appears in the event chain
  3. Monitor with logcat:

    • Look for AccessibilityInputFilter logs
    • Check for FEATURE_TOUCH_EXPLORATION or similar feature flags
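
For step 1 of the list above, the enabled services can also be enumerated programmatically. A minimal sketch using the public framework APIs; the class name AccessibilityAudit and the capability heuristic are mine, and which exact capability installs the input filter can vary by Android version, so treat the flags as a hint rather than a definitive test.

import android.accessibilityservice.AccessibilityServiceInfo;
import android.content.Context;
import android.util.Log;
import android.view.accessibility.AccessibilityManager;

import java.util.List;

// Minimal sketch: list enabled accessibility services and flag the ones that can
// perform gestures or request touch exploration, i.e. the ones most likely to put
// an AccessibilityInputFilter into the touch event path.
public final class AccessibilityAudit {

    public static void dumpEnabledAccessibilityServices(Context context) {
        AccessibilityManager am =
                (AccessibilityManager) context.getSystemService(Context.ACCESSIBILITY_SERVICE);
        List<AccessibilityServiceInfo> services =
                am.getEnabledAccessibilityServiceList(AccessibilityServiceInfo.FEEDBACK_ALL_MASK);
        for (AccessibilityServiceInfo info : services) {
            int caps = info.getCapabilities();
            boolean touchesInput =
                    (caps & AccessibilityServiceInfo.CAPABILITY_CAN_PERFORM_GESTURES) != 0
                            || (caps & AccessibilityServiceInfo.CAPABILITY_CAN_REQUEST_TOUCH_EXPLORATION) != 0;
            Log.i("AccessibilityAudit", info.getId()
                    + (touchesInput ? "  <-- intercepts touch input" : ""));
        }
    }
}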

Why Do Apps Do This?

Many apps misuse accessibility services for reasons that seem legitimate but are misguided:

  1. Automation: Automating repetitive tasks by simulating gestures
  2. Analytics: Tracking user interactions for “improving the user experience”
  3. Ad avoidance: Automatically closing ads or pop-ups
  4. Accessibility features: Providing alternative input methods for users with disabilities

However, the privacy implications are severe. An app with accessibility permissions can:

  • Read everything on your screen
  • Capture all your keystrokes (including passwords and credit card numbers)
  • Control other apps by simulating taps and gestures
  • Monitor all your notifications

The specific feature causing our jank issue is “Execute Gestures” monitoring, which requires intercepting all touch events to analyze gesture patterns.
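
For reference, the sketch below shows roughly what such a service looks like on the app side. It is a hypothetical example, not taken from any of the offending apps, and the exact flag that installs the input filter can differ across Android versions; the point is that once a service opts into touch or gesture monitoring, every touch event takes the detour analyzed above.

import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.AccessibilityServiceInfo;
import android.view.accessibility.AccessibilityEvent;

// Hypothetical accessibility service that opts into touch monitoring. It also needs
// android:canRequestTouchExplorationMode="true" in its accessibility-service XML config.
public class GestureMonitorService extends AccessibilityService {

    @Override
    protected void onServiceConnected() {
        AccessibilityServiceInfo info = getServiceInfo();
        // Ask the system to route touch interaction through this service.
        info.flags |= AccessibilityServiceInfo.FLAG_REQUEST_TOUCH_EXPLORATION_MODE;
        setServiceInfo(info);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Gesture and interaction callbacks would be handled here.
    }

    @Override
    public void onInterrupt() {
        // Nothing to clean up in this sketch.
    }
}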

Solution

For users:

  1. Review accessibility permissions regularly and remove unnecessary ones
  2. Be skeptical of apps requesting accessibility access for seemingly unrelated features
  3. Use built-in alternatives when available (e.g., Android’s built-in gesture navigation)

For developers:

  1. Avoid accessibility APIs unless absolutely necessary for accessibility features
  2. Use appropriate APIs for automation (e.g., UiAutomator for testing; see the sketch after this list)
  3. Consider performance implications of intercepting input events
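
For the automation case specifically, instrumentation tests can inject swipes through UiAutomator instead of shipping an accessibility service; no accessibility permission is involved, and the device only behaves this way while the test runs. A minimal sketch, assuming the AndroidX test libraries (the test class and coordinates are illustrative):

import androidx.test.platform.app.InstrumentationRegistry;
import androidx.test.uiautomator.UiDevice;

import org.junit.Test;

// Minimal sketch: inject a swipe from an instrumentation test via UiAutomator.
public class SwipeAutomationTest {

    @Test
    public void swipeUpOnHomeScreen() {
        UiDevice device = UiDevice.getInstance(InstrumentationRegistry.getInstrumentation());
        int w = device.getDisplayWidth();
        int h = device.getDisplayHeight();
        // Swipe from the lower third of the screen to the upper third in 20 steps.
        device.swipe(w / 2, h * 2 / 3, w / 2, h / 3, 20);
    }
}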

For system integrators:

  1. Add warnings about performance impact when enabling gesture monitoring
  2. Consider rate-limiting or batching optimizations in the framework
  3. Monitor for abusive patterns in accessibility service usage

Conclusion

This case demonstrates how well-intentioned features (accessibility services) can be misused in ways that significantly degrade system performance. The architectural decision to batch gesture events for Vsync-aligned processing makes sense for gesture recognition accuracy but has unintended consequences for scrolling smoothness.

As Android continues to add features and complexity, understanding these cross-layer interactions becomes increasingly important for maintaining good user experience. Performance debugging often requires tracing issues across multiple system layers—from hardware touchscreens through kernel drivers, framework services, and finally to application rendering.

References

  1. Android Source Code - InputDispatcher.cpp
  2. Android Source Code - AccessibilityInputFilter.java
  3. Android Developer Guide - Accessibility Service

Zhihu Version

Since blog comments aren’t convenient for discussion, you can visit the Zhihu version of this article to like and discuss:

Zhihu - Case Study of System-Wide Lag Caused by Android Accessibility Services

About Me && Blog

Below is my personal introduction and related links. I look forward to exchanging ideas with fellow professionals. “When three people walk together, one must be my teacher!”

  1. Blogger Introduction : Contains personal WeChat and WeChat group links.
  2. Blog Content Navigation : A navigation guide for my blog content.
  3. Curated Excellent Blog Articles - Android Performance Optimization Must-Knows : Welcome self-recommendations and recommendations (WeChat private chat is fine).
  4. Android Performance Optimization Knowledge Planet : Welcome to join, thanks for your support!

One walks faster alone, but a group walks further together.

Scan WeChat QR Code
