RFC: Improvements to the compiler to make it fast #621

Open · wants to merge 1 commit into `main`

Conversation

@layershifter (Member) commented Nov 14, 2024:

Griffel features a compiler (available as a Webpack loader, Vite plugin, etc.) that executes Ahead-of-Time (AOT) compilation to enhance runtime performance and shift workload to the build phase. The compiler also enables CSS extraction, offering significant performance benefits for applications.

However, we have received multiple reports indicating that the compiler's speed is suboptimal. This project seeks to improve the compiler's performance.

📊 Bundle size report

✅ No changes found

@layershifter layershifter marked this pull request as ready for review November 14, 2024 15:57
@layershifter layershifter requested a review from a team as a code owner November 14, 2024 15:57

### Changes in tooling

We will adopt Rust as the primary language for the next-gen compiler and use [`napi-rs`](https://github.com/napi-rs/napi-rs) to create Node.js bindings.
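As a rough sketch, a napi-rs binding could look like the following. The `transform` function, its signature, and the crate setup noted in the comment are illustrative, not the final compiler API.

```rust
// Cargo.toml (sketch): napi = "2", napi-derive = "2", crate-type = ["cdylib"]
use napi_derive::napi;

/// Hypothetical entry point exposed to Node.js. napi-rs generates the N-API
/// glue and TypeScript definitions, so the Webpack loader / Vite plugin can
/// call this like a regular JS function.
#[napi]
pub fn transform(filename: String, source: String) -> String {
    // ... run the AOT transform on `source` here ...
    format!("/* transformed {filename} */\n{source}")
}
```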
Contributor commented:

Just a thought, feel free to disregard entirely 😃

You could adopt Rust as the primary language and target WASM. By doing this you avoid the need to build binaries for all platforms and open the door to running the compiler in the browser (though this seems like a neat thing rather than something immediately useful). This potentially comes at the cost of some raw performance, but I would expect it to be significantly faster than the status quo.

Rust has good tooling for WASM: https://rustwasm.github.io/docs/wasm-bindgen/
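For reference, a minimal wasm-bindgen export along these lines might look like the sketch below; the function name and body are illustrative.

```rust
use wasm_bindgen::prelude::*;

/// Hypothetical compiler entry point compiled to WASM. wasm-bindgen copies
/// the JS string into linear memory on the way in and copies the result back
/// out, which is the string-transfer cost discussed in the replies below.
#[wasm_bindgen]
pub fn transform(source: &str) -> String {
    // ... run the AOT transform here ...
    source.to_owned()
}
```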

Member Author commented:

That's what @Anber proposed yesterday 🐱 It's something we will need to check once we have something working.

The problem with WASM that we faced before with @ling1726 is the cost of moving strings between the JS & WASM worlds. Longer strings => worse results, as the following happens:

```js
for (let i = index; i < index + size; ++i)
  s += String.fromCharCode(buffer[i]);
```

https://stackoverflow.com/a/41372484/6488546

AFAIK there is nothing better yet than messing with Uint8Arrays, which proved to be extremely slow when we re-implemented @emotion/hash in WASM and managed to get faster than JS only with a custom memory allocator.

Contributor commented:

Not sure if this helps, but the wasm-bindgen str docs state:

If you don't want to perform this copy, and would rather work with handles to JavaScript string values, use the js_sys::JsString type.

Which implies you can avoid the copy and still work with the string on the WASM/Rust side of the house.

https://rustwasm.github.io/wasm-bindgen/api/js_sys/struct.JsString.html
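A small sketch of that idea, assuming the work can be expressed through `js_sys::JsString` handle methods (the function itself is illustrative):

```rust
use wasm_bindgen::prelude::*;
use js_sys::JsString;

/// Works on the string via its JS handle instead of copying it into WASM
/// linear memory up front; each `char_code_at` call crosses the JS/WASM
/// boundary, so this trades one big copy for many small calls.
#[wasm_bindgen]
pub fn count_ascii(source: &JsString) -> u32 {
    let mut count = 0;
    for i in 0..source.length() {
        if source.char_code_at(i) < 128.0 {
            count += 1;
        }
    }
    count
}
```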

@ling1726 (Member) commented Nov 17, 2024:

In order for strings to exist in the WASM shared memory (which is essentially just numbers), serialization through shared buffers still needs to happen. Once wasm-bindgen compiles, the generated output will contain something like this:

```js
function passStringToWasm0(arg, malloc, realloc) {
    if (typeof(arg) !== 'string') throw new Error(`expected a string argument, found ${typeof(arg)}`);

    if (realloc === undefined) {
        const buf = cachedTextEncoder.encode(arg);
        const ptr = malloc(buf.length, 1) >>> 0;
        getUint8ArrayMemory0().subarray(ptr, ptr + buf.length).set(buf);
        WASM_VECTOR_LEN = buf.length;
        return ptr;
    }

    let len = arg.length;
    let ptr = malloc(len, 1) >>> 0;

    const mem = getUint8ArrayMemory0();

    let offset = 0;

    for (; offset < len; offset++) {
        const code = arg.charCodeAt(offset);
        if (code > 0x7F) break;
        mem[ptr + offset] = code;
    }
    // etc....
```

If wasm-bindgen has fs access, it might be possible to do the serialization into a buffer only once and write the output from the WASM side... but that text encoding will still need to happen to make sure the string exists in shared memory in the first place. I believe the 'copy' referred to here means that a separate string exists in WASM memory 😢

This introduces complexity. Internally, the compiler functions as a micro-bundler that resolves imports, evaluates expressions, and replaces them with actual values. The simplified lifecycle is below.

```mermaid
graph TD
```
Member commented:

code snippets in diagrams seem to be broken for me

```js
// Output code

const fn = () => {
function require(moduleId) {
```
Member commented:

Suggested change:

```diff
-function require(moduleId) {
+function __require_wd40_module(moduleId) {
```

Have I understood this require correctly?

```js
}

__wd40_module('/theme.ts', function (module, exports, require) {
  exports.spacing = value => value + 'px';
```
@ling1726 (Member) commented Nov 17, 2024:

This example uses a simple function transform for spacing. If this function became more complicated and used imported constants, do I understand correctly that it could become:

```js
__wd40_module('/theme.ts', function (module, exports, require) {
  exports.spacing = value => {
    return value + require('/constants.ts').PX;
  };
});
```

if the export used other functions in the module, would the tree shaking also include those functions in the wd40 module?

```js
  exports.spacing = value => value + 'px';
});

__wd40_module('/input.ts?entrypoint', function (module, exports, require) {
```
@ling1726 (Member) commented Nov 17, 2024:

it would certainly be interesting to cache the output of the input files and their dep trees to enable incremental builds on style files for large codebases.

However, if fully using Rust turns out to be as blazing fast as hoped, it might not be necessary 🤔


We will introduce a strict mode that disables certain features to enhance compiler speed. For instance, this mode will not support `export *` and CommonJS modules.

**This mode will be implemented first.**
@ling1726 (Member) commented Nov 17, 2024:

Might want to make it clear whether non-performant syntaxes like `export *` will be supported eventually or will remain unsupported indefinitely?


#### Resolver

A fast resolver is also necessary. While we will still offer the option to use a resolver from a bundler (e.g., Webpack, Vite, etc.), we will primarily use the OXC resolver, which is 28x faster than Webpack's resolver.
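For illustration, resolving a specifier with the `oxc_resolver` crate looks roughly like this (the options and paths are examples, not the compiler's actual configuration):

```rust
use std::path::Path;

use oxc_resolver::{ResolveOptions, Resolver};

fn main() {
    // Resolve './theme' the way Node/bundlers would, relative to the
    // directory containing the importing file.
    let options = ResolveOptions {
        extensions: vec![".ts".into(), ".tsx".into(), ".js".into()],
        ..ResolveOptions::default()
    };
    let resolver = Resolver::new(options);

    match resolver.resolve(Path::new("/project/src"), "./theme") {
        Ok(resolution) => println!("resolved to {}", resolution.full_path().display()),
        Err(error) => eprintln!("resolution failed: {error}"),
    }
}
```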
@ling1726 (Member) commented Nov 17, 2024:

what is the motivation for an option that uses a bundler's resolver?
