
Add integration testing tooling setup for JIT #3925

Closed

Conversation

@thecrypticace (Contributor) commented Apr 4, 2021

Tools:

  • Hugo
  • Laravel Mix v6
  • Next
  • Nuxt
  • Parcel 2
  • PostCSS CLI
  • Snowpack
  • Vite
  • webpack 5 + postcss
  • webpack 5 + sass/less/etc…
  • Webpacker

Actions:

  • Initial Build
  • Add utility to html files
  • Add utility to vue files
  • Add utility to react/jsx files
  • Add @apply
  • Update @apply
  • Change config

There are a few things I have to figure out:

  • Proper process cleanup: if a spawned process is not an apparent child process, Jest does not exit, but killing the process that spawned it doesn't clean it up either. This affects the Laravel Mix setup, which is otherwise working.
  • Handling known / expected tooling failures

@thecrypticace changed the title from "Add simple integration testing tooling setup for JIT" to "Add integration testing tooling setup for JIT" on Apr 4, 2021
@thecrypticace (Contributor, Author) commented Apr 4, 2021

I need to adjust the actions scripts to install npm dependencies for the tooling directories.

@RobinMalfait Do you know of / can think of a good way to handle these given that we're caching node deps here?
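One way to handle the per-tool installs is a small script that visits each tooling directory before the tests run. The `integrations/` layout and this script are assumptions for illustration, not from the PR:

```shell
#!/bin/sh
# Hedged sketch: install npm dependencies for every tooling directory.
# The integrations/ directory name is an assumed layout, not from this PR.
set -e
for dir in integrations/*/; do
  [ -d "$dir" ] || continue   # glob matched nothing; skip
  (cd "$dir" && npm install)
done
```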

@RobinMalfait (Member) commented Apr 4, 2021

I think there are a few things you can do. We could create individual workflows for each tool. We could also set up steps for each tool.

If you look at the current GitHub Actions workflow, you will see this:

- name: Use cached node_modules
  id: cache  # This is the ID used below
  uses: actions/cache@v2.1.1
  with:
    path: node_modules
    key: nodeModules-${{ hashFiles('**/package-lock.json') }}-${{ matrix.node-version }}
    restore-keys: |
      nodeModules-
- name: Install dependencies
  if: steps.cache.outputs.cache-hit != 'true'  # This is referring to steps.ID-OF-STEP.outputs.cache-hit
  run: npm install
  env:
    CI: true

So we could create a step for each tool; one way to do that is to define a matrix entry for each tool. We already do that for the node-version:

strategy:
  matrix:
    node-version: [12.x, 14.x]

Then we use that node-version inside the steps below. In the same way, we can define a matrix across each node-version (already there) and each tool. Example: https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#example-running-with-more-than-one-operating-system
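Concretely, a combined matrix might look like the untested sketch below; the `tool` values are hypothetical placeholders, one per tooling directory:

```yaml
strategy:
  matrix:
    node-version: [12.x, 14.x]
    # Hypothetical names, one entry per integration directory:
    tool: [vite, next, snowpack, postcss-cli]
```

Each combination of `node-version` and `tool` then becomes its own job, referenced in steps via `${{ matrix.tool }}`.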

Note, this could result in a lot of combinations 😅

If we go with this approach, then we should also be able to run them in parallel (I think that's the default behaviour).

I think the only thing left to do is to also encode the tool in the key field when caching:

- name: Use cached node_modules
  id: cache-${{ matrix.tool }}  # This is the ID used below
  uses: actions/cache@v2.1.1
  with:
    path: node_modules
    key: nodeModules-${{ hashFiles('**/package-lock.json') }}-${{ matrix.node-version }}-${{ matrix.tool }}
    restore-keys: |
      nodeModules-
- name: Install dependencies
  if: steps.cache-${{ matrix.tool }}.outputs.cache-hit != 'true'  # This is referring to steps.ID-OF-STEP.outputs.cache-hit
  run: npm install
  env:
    CI: true

I haven't tested it, and we might need to change this approach if we have a lot of tools to test, but I think this can work. What do you think?

@thecrypticace (Contributor, Author) commented

Yeah, a separate GitHub action would work. We wouldn't necessarily need to do a matrix build (but we could). It could be really easy to inch close to the 2,000-minute/month limit running all these integration tests, and I want to be mindful of that.

@thecrypticace (Contributor, Author) commented

Seems like CI is having problems with the main tests. They're taking a really really long time to even start up. Perhaps related to test coverage checking?

@adamwathan (Member) commented

Superseded by #4354 — thanks though @thecrypticace you are the best man ❤️

@adamwathan closed this May 20, 2021
3 participants