Add Batched Mode #15502

Merged: 4 commits from the batched-mode branch merged into facebook:master on May 13, 2019

Conversation

@acdlite (Collaborator) commented Apr 25, 2019

React has an unfortunate quirk where updates are sometimes synchronous — where React starts rendering immediately within the call stack of setState — and sometimes batched, where updates are flushed at the end of the current event. Any update that originates within the call stack of the React event system is batched. This encompasses most updates, since most updates originate from an event handler like onClick or onChange. It also includes updates triggered by lifecycle methods or effects. But there are also updates that originate outside React's event system, like timer events, network events, and microtasks (promise resolution handlers). These are not batched, which results in both worse performance (multiple render passes instead of a single one) and confusing semantics.
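
To make this quirk concrete, here is a minimal, illustrative sketch (not part of this PR). It assumes a class component rendered with the legacy `ReactDOM.render` entry point, and uses the existing `ReactDOM.unstable_batchedUpdates` API to show how batching can be forced manually today:

```jsx
import React from 'react';
import ReactDOM from 'react-dom';

class Counter extends React.Component {
  state = {a: 0, b: 0};

  handleClick = () => {
    // Inside a React event handler: both updates are batched into one render.
    this.setState(prev => ({a: prev.a + 1}));
    this.setState(prev => ({b: prev.b + 1}));
  };

  componentDidMount() {
    setTimeout(() => {
      // Outside React's event system (a timer): in Legacy Mode each setState
      // renders synchronously, so this causes two separate render passes.
      this.setState(prev => ({a: prev.a + 1}));
      this.setState(prev => ({b: prev.b + 1}));

      // Batching can be forced manually with the unstable API:
      ReactDOM.unstable_batchedUpdates(() => {
        this.setState(prev => ({a: prev.a + 1}));
        this.setState(prev => ({b: prev.b + 1}));
      });
    }, 1000);
  }

  render() {
    console.log('render', this.state.a, this.state.b);
    return <button onClick={this.handleClick}>Update</button>;
  }
}

ReactDOM.render(<Counter />, document.getElementById('root'));
```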

Ideally all updates would be batched by default. Unfortunately, it's easy for components to accidentally rely on this behavior, so changing it could break existing apps in subtle ways.

One way to move to a batched-by-default model is to opt into Concurrent Mode (still experimental). But Concurrent Mode introduces additional semantic changes that apps may not be ready to adopt.

This commit introduces an additional mode called Batched Mode. Batched Mode enables a batched-by-default model that defers all updates to the next React event. Once it begins rendering, React will not yield to the browser until the entire render is finished.

Batched Mode is a superset of Strict Mode. It fires all the same warnings. It also drops the forked Suspense behavior used by Legacy Mode, in favor of the proper semantics used by Concurrent Mode.

I have not added any public APIs that expose the new mode yet. I'll do that in subsequent commits.

@sizebot commented Apr 25, 2019

Details of bundled changes.

Comparing: 3f058de...e4fdd20

react-art

File | Filesize Diff | Gzip Diff | Prev Size | Current Size | Prev Gzip | Current Gzip | ENV
react-art.development.js 0.0% +0.1% 557.41 KB 557.52 KB 121.73 KB 121.85 KB UMD_DEV
react-art.production.min.js 0.0% 0.0% 95.59 KB 95.61 KB 29.32 KB 29.32 KB UMD_PROD
react-art.development.js 0.0% +0.1% 488.43 KB 488.55 KB 104.34 KB 104.46 KB NODE_DEV
react-art.production.min.js 0.0% -0.0% 60.57 KB 60.59 KB 18.64 KB 18.64 KB NODE_PROD
ReactART-dev.js 0.0% +0.1% 497.55 KB 497.6 KB 103.42 KB 103.52 KB FB_WWW_DEV
ReactART-prod.js -0.0% -0.0% 195.02 KB 195 KB 33.18 KB 33.17 KB FB_WWW_PROD

react-native-renderer

File | Filesize Diff | Gzip Diff | Prev Size | Current Size | Prev Gzip | Current Gzip | ENV
ReactNativeRenderer-dev.js 0.0% +0.1% 629.11 KB 629.11 KB 133.6 KB 133.72 KB RN_FB_DEV
ReactNativeRenderer-prod.js -0.0% -0.0% 243.9 KB 243.79 KB 42.37 KB 42.35 KB RN_FB_PROD
ReactNativeRenderer-profiling.js -0.0% -0.1% 252.08 KB 251.98 KB 43.94 KB 43.91 KB RN_FB_PROFILING
ReactNativeRenderer-dev.js 0.0% +0.1% 629.02 KB 629.02 KB 133.57 KB 133.68 KB RN_OSS_DEV
ReactNativeRenderer-prod.js -0.0% -0.0% 243.92 KB 243.81 KB 42.36 KB 42.35 KB RN_OSS_PROD
ReactNativeRenderer-profiling.js -0.0% -0.1% 252.1 KB 251.99 KB 43.94 KB 43.91 KB RN_OSS_PROFILING
ReactFabric-dev.js 0.0% +0.1% 617.79 KB 617.79 KB 130.89 KB 131 KB RN_FB_DEV
ReactFabric-prod.js -0.0% -0.0% 237.66 KB 237.55 KB 41.1 KB 41.08 KB RN_FB_PROD
ReactFabric-profiling.js -0.0% -0.1% 245.31 KB 245.21 KB 42.71 KB 42.69 KB RN_FB_PROFILING
ReactFabric-dev.js 0.0% +0.1% 617.7 KB 617.7 KB 130.86 KB 130.97 KB RN_OSS_DEV
ReactFabric-prod.js -0.0% -0.0% 237.67 KB 237.56 KB 41.09 KB 41.07 KB RN_OSS_PROD
ReactFabric-profiling.js -0.0% -0.1% 245.33 KB 245.22 KB 42.71 KB 42.68 KB RN_OSS_PROFILING

react-test-renderer

File | Filesize Diff | Gzip Diff | Prev Size | Current Size | Prev Gzip | Current Gzip | ENV
react-test-renderer.development.js 0.0% +0.1% 501.95 KB 502.08 KB 107.06 KB 107.18 KB UMD_DEV
react-test-renderer.production.min.js 🔺+0.1% -0.0% 61.79 KB 61.83 KB 18.97 KB 18.96 KB UMD_PROD
react-test-renderer.development.js 0.0% +0.1% 497.49 KB 497.62 KB 105.95 KB 106.07 KB NODE_DEV
react-test-renderer.production.min.js 🔺+0.1% 🔺+0.1% 61.47 KB 61.51 KB 18.82 KB 18.83 KB NODE_PROD
ReactTestRenderer-dev.js 0.0% +0.1% 508.24 KB 508.33 KB 105.66 KB 105.76 KB FB_WWW_DEV
react-test-renderer-shallow.development.js 0.0% 0.0% 41.88 KB 41.88 KB 10.71 KB 10.71 KB UMD_DEV

react-noop-renderer

File | Filesize Diff | Gzip Diff | Prev Size | Current Size | Prev Gzip | Current Gzip | ENV
react-noop-renderer.development.js +3.6% +2.4% 35.15 KB 36.4 KB 8.71 KB 8.91 KB NODE_DEV
react-noop-renderer.production.min.js 🔺+5.2% 🔺+2.3% 10.49 KB 11.03 KB 3.45 KB 3.53 KB NODE_PROD
react-noop-renderer-persistent.development.js +3.5% +2.4% 35.26 KB 36.51 KB 8.72 KB 8.93 KB NODE_DEV
react-noop-renderer-persistent.production.min.js 🔺+5.1% 🔺+2.3% 10.51 KB 11.05 KB 3.46 KB 3.54 KB NODE_PROD
react-noop-renderer-server.development.js 0.0% +0.1% 1.83 KB 1.83 KB 875 B 876 B NODE_DEV
react-noop-renderer-server.production.min.js 0.0% 🔺+0.2% 813 B 813 B 489 B 490 B NODE_PROD

react-reconciler

File | Filesize Diff | Gzip Diff | Prev Size | Current Size | Prev Gzip | Current Gzip | ENV
react-reconciler.development.js 0.0% +0.1% 485.96 KB 486.03 KB 102.77 KB 102.87 KB NODE_DEV
react-reconciler.production.min.js 0.0% 🔺+0.2% 61.51 KB 61.53 KB 18.38 KB 18.42 KB NODE_PROD
react-reconciler-persistent.development.js 0.0% +0.1% 483.87 KB 483.93 KB 101.89 KB 101.99 KB NODE_DEV
react-reconciler-persistent.production.min.js 0.0% 🔺+0.2% 61.52 KB 61.54 KB 18.39 KB 18.43 KB NODE_PROD
react-reconciler-reflection.production.min.js 0.0% 🔺+0.1% 2.43 KB 2.43 KB 1.09 KB 1.09 KB NODE_PROD

Generated by 🚫 dangerJS

Suspense in Batched Mode
Should have same semantics as Concurrent Mode.
@sophiebits (Collaborator)

Yasss. I found myself wishing for batching literally earlier today. Hope this gets released soon.

Why couple it to strict mode? It doesn't require the strict semantics, does it? It feels to me like strict and batched are orthogonal (but both required for concurrent compatibility).

  • strict: (compatibility with the fact that) once a render is started, it might not always be finished
  • batched: once `setState` is called, there might be some time before the state update happens (already the case in an event handler)

I am wary of too many flags though… 😬

@acdlite (Collaborator, Author) commented Apr 25, 2019

@sophiebits It does require Strict Mode because of Suspense. The quirky, forked semantics we use for Suspense in Legacy Mode today aren't compatible with the newer features we're planning to add, so we want to move everyone over to the "proper" semantics used by Concurrent Mode instead.

I am wary of too many flags though… 😬

Yeah I share this concern.

@sebmarkbage (Collaborator) commented Apr 25, 2019

Suspense in Legacy Mode today aren't compatible with the newer features

The more immediate problem isn't just that it's incompatible across modes; the Legacy Mode experience also has much worse semantics. For example, it breaks a lot of subtle patterns: you can't rely on layout being available in `componentDidMount`, and all refs may be null for a bit. This makes it much harder than it needs to be to write other code in a Suspense tree. It also means that if you write code that is compatible with Batched or Concurrent Mode, it might not be compatible with Legacy Mode (as in, it'll crash). So the downgrade scenario is really bad.

I like to think of Suspense as not existing in Legacy Mode. It's just a Batched (Strict) Mode feature.

To me, that's the main motivation for this mode.

@sebmarkbage (Collaborator) left a comment

We're still planning on dropping support for nested <ConcurrentMode />, right? (Because the semantics are currently wrong and really complicated to even understand at the edges, and you often don't end up needing or being able to use it anyway. The alternative is a nested root and manually forwarded contexts.)

In that case it seems like this flag doesn't need to live at the per-fiber level, which would let us get rid of that field in prod. That implementation would look very different, though, so is this only phase one?

@acdlite (Collaborator, Author) commented Apr 25, 2019

I don't think the implementation would look that different. If/when we make it per root, I would replace `fiber.mode & BatchedMode` with something more like `root.mode & BatchedMode`.
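
For concreteness, here is a minimal sketch of the two variants of that check. The bit values and the `root.mode` field are assumptions for illustration:

```js
// Illustrative only; exact constants may differ in the actual reconciler.
const NoMode = 0b0000;
const StrictMode = 0b0001;
const BatchedMode = 0b0010;
const ConcurrentMode = 0b0100;

// Today: the mode is stored on each fiber.
function fiberIsBatched(fiber) {
  return (fiber.mode & BatchedMode) !== NoMode;
}

// If the flag moves to the root, the same check becomes per-root instead:
function rootIsBatched(root) {
  return (root.mode & BatchedMode) !== NoMode;
}
```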

@acdlite (Collaborator, Author) commented Apr 26, 2019

To-dos after discussing with @sebmarkbage:

  • We're planning to get rid of the <ConcurrentMode /> component API in favor of opting in the entire root. That means we don't need to store the mode on the Fiber. I'm going to move that work into a separate PR, but in anticipation of that change, the `createContainer` API exposed by the reconciler should accept an enum of different root types instead of separate `isBatched` and `isConcurrent` booleans, which don't make sense because you can't actually mix them (see the sketch after this list). I had previously considered that this should be a bitmask, but since there are only three distinct types (legacy, batched, and concurrent), an enum is probably less confusing.
  • Batched Mode should support deprioritizing hidden trees, though React still shouldn't yield when it comes back to work on the next level.
  • Batched Mode should schedule work at the same priority level that would be scheduled in Concurrent Mode. This means that Batched Mode is almost identical to the `disableYielding` feature flag, except it's per root and it also affects Suspense semantics.
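
A rough sketch of what such a root-type enum could look like (the names and the `createContainer` call shape are assumptions for illustration; a later commit in this PR describes the field as a RootTag):

```js
// Illustrative only; exact names and numeric values may differ.
const LegacyRoot = 0;
const BatchedRoot = 1;
const ConcurrentRoot = 2;

// Instead of separate isBatched / isConcurrent booleans, createContainer
// would accept exactly one of the three tags, e.g.:
//   createContainer(containerInfo, BatchedRoot, hydrate);
```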

@sebmarkbage (Collaborator) commented Apr 26, 2019

Batched Mode should schedule work at the same priority level that would be scheduled in Concurrent Mode.

Wait, this isn't quite right. Batched Mode should only ever have three different expiration times: Sync, Batched, or Offscreen, all of which are fixed constants.

Scheduler.next, high- vs. normal-priority events, etc. shouldn't matter; it's all just one batch. (Discrete matters for when we flush early, though. Batched is discrete.)

The level scheduled with the scheduler is kind of arbitrary. Can be Normal for Batched. Offscreen can be lower I guess.
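
A minimal sketch of what three fixed expiration-time constants could look like, assuming the reconciler's convention that expiration times are 31-bit integers and larger values mean higher priority (the exact values are illustrative):

```js
// Illustrative constants only; the reconciler's actual values may differ.
const MAX_SIGNED_31_BIT_INT = 1073741823;

const Sync = MAX_SIGNED_31_BIT_INT; // flushed synchronously
const Batched = Sync - 1;           // flushed at the end of the current React event
const Offscreen = 2;                // fixed low value for hidden/offscreen work
```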

@acdlite (Collaborator, Author) commented Apr 26, 2019

That's what I meant; I'm not getting rid of the Batched expiration time. I only meant that I'm going to use the same Scheduler priority as would otherwise be scheduled, until we give more thought to designing the different levels.

Use RootTag field to configure type of root
There are three types of roots: Legacy, Batched, and Concurrent.
@acdlite (Collaborator, Author) commented Apr 29, 2019

Changed my mind. I think the Scheduler priority for Batched updates should be Immediate (except for hidden subtrees, which will be downgraded to Idle). Mostly for implementation reasons: we need to be able to infer the priority level given an expiration time (`inferPriorityFromExpirationTime`). I think there's an argument that it should be Normal, and another argument that Immediate actually does make sense, but the priority system is underdesigned right now anyway, so I'm going to go with the thing that is most straightforward to implement.
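
A rough sketch of the kind of mapping being described, using stand-in constants (none of this is the PR's actual code; the names and values are assumptions):

```js
// Stand-in expiration times (see the earlier sketch).
const Sync = 1073741823;
const Batched = Sync - 1;
const Offscreen = 2;

// Stand-ins for Scheduler's priority levels.
const ImmediatePriority = 1;
const NormalPriority = 3;
const IdlePriority = 5;

function inferPriorityFromExpirationTime(expirationTime) {
  if (expirationTime === Sync || expirationTime === Batched) {
    // Batched updates share Immediate priority with Sync work for now.
    return ImmediatePriority;
  }
  if (expirationTime === Offscreen) {
    // Hidden subtrees are downgraded to Idle.
    return IdlePriority;
  }
  // Everything else (Concurrent Mode work) maps to an async priority.
  return NormalPriority;
}
```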

@acdlite force-pushed the batched-mode branch 2 times, most recently from e7835cf to 4f5545b on April 29, 2019 at 19:42
flushSync should not flush batched work
Treat Sync and Batched expiration times separately. Only Sync updates
are pushed to our internal queue of synchronous callbacks.

Renamed `flushImmediateQueue` to `flushSyncCallbackQueue` for clarity.
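
A simplified sketch of the distinction this commit describes. Only `flushSyncCallbackQueue` is named in the commit message; `scheduleWorkOnRoot`, `performWorkOnRoot`, and `scheduleCallback` are hypothetical stand-ins so the sketch is self-contained:

```js
const Sync = 1073741823;   // stand-in expiration times
const Batched = Sync - 1;

let syncQueue = [];

function scheduleWorkOnRoot(root, expirationTime) {
  if (expirationTime === Sync) {
    // Only Sync work enters the internal queue of synchronous callbacks,
    // which is what flushSync drains.
    syncQueue.push(() => performWorkOnRoot(root, expirationTime));
  } else {
    // Batched (and Concurrent) work goes through the Scheduler instead,
    // so flushSync does not flush it.
    scheduleCallback(() => performWorkOnRoot(root, expirationTime));
  }
}

function flushSyncCallbackQueue() {
  // Renamed from flushImmediateQueue: drains only the Sync callbacks.
  const queue = syncQueue;
  syncQueue = [];
  queue.forEach(cb => cb());
}

// Hypothetical stand-ins so the sketch runs on its own.
function performWorkOnRoot(root, expirationTime) {}
function scheduleCallback(cb) { setTimeout(cb, 0); }
```
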
@acdlite (Collaborator, Author) commented Apr 29, 2019

I'm going to deal with hidden trees in a separate PR because I want to remove <ConcurrentMode /> first.

@acdlite (Collaborator, Author) commented Apr 29, 2019

So this is ready for another review.

@NE-SmallTown (Contributor) commented May 6, 2019

Ideally all updates would be batched by default. Unfortunately, it's easy for components to accidentally rely on this behavior, so changing it could break existing apps in subtle ways.

I'm a little confused here: how would you implement this? I.e., if something is outside of React's control (a timer, a network event, ...), how can its updates still be batched? (I mean, without an extra API like flush().)

You said one way is to use Concurrent Mode, but from my perspective the wording above suggests that there are ways to make batching the default without us doing anything. That isn't true, because we still have to use some opt-in mechanism.

@acdlite merged commit 862f499 into facebook:master on May 13, 2019
@NE-SmallTown (Contributor)

What does 'React event' mean in 'to the next React event'?

NMinhNguyen referenced this pull request in enzymejs/react-shallow-renderer Jan 29, 2020
@gaearon mentioned this pull request on Mar 29, 2022