### What
Render `noindex` into the flight data and RSC payload when the page path is
`/404`.
### Why
During static generation, `noindex` is not rendered because the `statusCode`
from the mock request is 200. Instead, we can rely on the `pagePath`, since
the `/404` page should always contain `noindex`.
We were previously missing `noindex` in flight generation; now we render it
when the page is the 404 page.
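A minimal, hypothetical sketch of the decision (the names below are illustrative, not the actual Next.js internals): because the mocked request always reports 200 during static generation, the check falls back to the page path.

```ts
// Illustrative only — not the real Next.js implementation. The idea is to
// treat the `/404` page path as an additional signal for emitting noindex,
// since the mocked request's statusCode is always 200 during static generation.
function shouldEmitNoindex(pagePath: string, statusCode: number): boolean {
  return statusCode === 404 || pagePath === '/404'
}
```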
`node_modules` gets ignored when deployed unless explicitly allowed, and
the regular install command will clobber what was in there.
This allowlists the `node_modules` directory, copies it into a new folder,
runs the install command, and then merges the patched `node_modules` back in
so the patched modules are available in the test.
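A rough sketch of that copy/install/merge flow using plain Node APIs; the paths and the install command below are assumptions, not the exact setup script.

```ts
import { cpSync } from 'node:fs'
import { execSync } from 'node:child_process'

// 1. Preserve the patched node_modules before the install clobbers it.
cpSync('node_modules', '.patched-node-modules', { recursive: true })

// 2. Run the regular install command.
execSync('pnpm install', { stdio: 'inherit' })

// 3. Merge the patched modules back on top of the fresh install so the
//    deployed test sees the patched versions.
cpSync('.patched-node-modules', 'node_modules', { recursive: true, force: true })
```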
Only one test in this suite is failing, and it's related to file tracing,
which should be fixed in a follow-up. For now this enables deploy tests for
everything else to make sure we don't regress.
This also simplifies the original test so it no longer requires stopping the
server and patching a file.
- Fixed redirects tests not working when deployed because they were `POST`
requests to a static page (see the sketch after this list)
- Skipped the 404 test for a similar reason: a `POST` to the static not-found
page is handled differently, and we won't have access to the runtime logs
anyway
- Refactored interception routes test to not rely on runtime logs
- Fixed revalidation test & removed comment about flakiness
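As a hedged illustration of the redirects fix (the route names are placeholders, and `next.fetch` is assumed to be the e2e harness helper taking `RequestInit`-style options): the test now issues a `GET` and asserts on the redirect response itself instead of runtime logs.

```ts
it('handles the redirect when deployed', async () => {
  // GET the statically generated page instead of POSTing to it.
  const res = await next.fetch('/redirected-page', { redirect: 'manual' })

  expect(res.status).toBe(307)
  expect(res.headers.get('location')).toContain('/destination')
})
```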
<details>
<summary>Validated Run Summary</summary>
![CleanShot 2024-06-13 at 13 45 32@2x](https://github.com/vercel/next.js/assets/1939140/8b85cb60-b389-451c-b449-41067f86a8d3)
</details>
These 2 tests use an in-memory data store that won't necessarily be shared
across invocations of the lambda. This skips the tests that rely on that
functionality, as testing it in `next start` should be sufficient.
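A minimal sketch of the skip, assuming the `isNextDeploy` flag the e2e test harness exposes to tests:

```ts
// Skip these suites when running against a deployment: lambdas don't share
// memory between invocations, so the in-memory data store can't be exercised
// reliably there. Testing it against `next start` is sufficient.
const describeWithStore = (global as any).isNextDeploy ? describe.skip : describe

describeWithStore('in-memory data store', () => {
  // ...tests that rely on the shared in-memory store...
})
```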
This test was only failing because `vercel logs` limits the output to 100
lines, and telemetry debugging was adding a lot of verbosity to the build
logs.
This bumps the log lines to a higher value to give some more breathing room,
and includes a drive-by `check` -> `retry` refactor.
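The `check` -> `retry` refactor looks roughly like the following (a sketch using the repo's `next-test-utils` helpers, with the signatures assumed; the route and expected text are placeholders):

```ts
import { retry } from 'next-test-utils'

it('eventually serves the updated content', async () => {
  // Instead of `check(fn, /expected/)`, poll with `retry` until the
  // assertions inside stop throwing.
  await retry(async () => {
    const html = await next.render('/some-page')
    expect(html).toContain('expected content')
  })
})
```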
This disables tests that should not be run in a deployed environment,
because they use incompatible APIs or there's no reason to test them
outside of `next start`. Specifically, this disables tests for things like:
- Using `next.patchFile`, `next.renameFile`, etc.
- Attempting to use `next.cliOutput` to query runtime logs. When
deployed, these are only build-time logs.
[Latest Run](https://github.com/vercel/next.js/actions/runs/9483807368)
- `next.cliOutput` will only refer to build-time logs, so this particular
assertion won't work
- Drive-by refactor for it to use `retry` instead of `check`
Verified this passes when deployed: [Test Run](https://github.com/vercel/next.js/actions/runs/9471783416/job/26095882422)
Changes:
- Add a setup step that clears the project so it doesn't happen in each
runner
- Run when a release is published rather than on cron
- Notify via Slack when a failure occurs
- Leverage build_reusable for the test runner to match the
build_and_test workflow
- Fix the `next-deploy` script, which wasn't properly logging/catching errors
- Add a manifest to ignore known issues
- Split into 6 runners with a concurrency of 2 (12 deploys at a time)
- Add some logging so we know what's happening
- Disable Playwright trace mode (it kept failing to find a trace file and
cluttering the output; we don't think we need it here anyway)
<details>
<summary>Removed noisy output</summary>
![CleanShot 2024-06-10 at 14 08 05@2x](https://github.com/vercel/next.js/assets/1939140/f227e71c-95b4-4859-90de-a23c88c55ea8)
</details>