Update warp version #1624

Merged
ktangsali merged 4 commits into NVIDIA:main from ktangsali:update-warp-version
May 6, 2026

Conversation

@ktangsali
Collaborator

PhysicsNeMo Pull Request

Description

PR #1564 fixed compatibility with the latest version of warp but forgot to update the version requirement in the TOML file. This PR completes that update.

#1564

Review Process

All PRs are reviewed by the PhysicsNeMo team before merging.

Depending on which files are changed, GitHub may automatically assign a maintainer for review.

We are also testing AI-based code review tools (e.g., Greptile), which may add automated comments with a confidence score.
This score reflects the AI's assessment of merge readiness; it is not a qualitative judgment of your work, nor an
indication that the PR will be accepted or rejected.

AI-generated feedback should be reviewed critically for usefulness.
You are not required to respond to every AI comment, but they are intended to help both authors and reviewers.
Please react to Greptile comments with 👍 or 👎 to provide feedback on their accuracy.

@copy-pr-bot

copy-pr-bot Bot commented May 6, 2026

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@greptile-apps
Contributor

greptile-apps Bot commented May 6, 2026

Greptile Summary

This PR completes the warp-lang version update begun in #1564 by bumping the minimum version requirement to >=1.11.0 in pyproject.toml and all example requirements.txt files, and clearing now-obsolete warp.context deprecation warning outputs from tutorial notebooks.

  • pyproject.toml / uv.lock: warp-lang minimum raised from >=1.5.0 to >=1.11.0; the lock file resolves to 1.12.1 and gains sha256 hashes for torch 2.11.0 wheels (supply-chain hardening).
  • requirements.txt files (5 examples): All independently-pinned warp-lang floors aligned to >=1.11.0, including one that previously had no version constraint at all (domino).
  • Jupyter notebooks (4 files): Stale cell outputs containing warp.context deprecation warnings removed; no source cell changes.

Important Files Changed

  • pyproject.toml: Bumps the warp-lang minimum from >=1.5.0 to >=1.11.0 with an explanatory comment; correctly aligns with the API changes in warp 1.11/1.13.
  • uv.lock: Updates the warp-lang specifier to >=1.11.0 (resolves to 1.12.1) and refreshes torch 2.11.0 wheel entries to include sha256 hashes, an improvement for supply-chain integrity.
  • examples/cfd/darcy_fno/requirements.txt: Bumps warp-lang from >=1.6.0 to >=1.11.0, consistent with the project-wide version update.
  • examples/cfd/external_aerodynamics/domino/requirements.txt: Adds a >=1.11.0 constraint to a previously unpinned warp-lang entry.
  • examples/minimal/datapipes/tutorial_1_getting_started.ipynb: Clears stale warp.context DeprecationWarning outputs from notebook cells; no source code changes.
  • examples/minimal/mesh/tutorial_7_domain_mesh.ipynb: Removes a warp.context DeprecationWarning from a mixed output cell; stderr output is preserved.
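As a rough illustration of the pyproject.toml change described above, the bump would look something like the following sketch, assuming a standard PEP 621 dependency table (the project name and comment text here are illustrative, not copied from the repository):

```toml
[project]
name = "physicsnemo"
dependencies = [
    # warp >= 1.11 is the first release where wp.Device is available;
    # the previous floor was >= 1.5.0
    "warp-lang>=1.11.0",
]
```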

Reviews (1): Last reviewed commit: "update warp version"

@coreyjadams
Collaborator

Warp 1.13.0 is out, and ships support for Rapids memory manager. This would in principle give us pretty good savings in memory for everywhere we use warp for something memory intensive, since we can then share with pytorch.

I'd propose we shift this to 1.13 for a baseline instead of 1.11 unless there is a reason to hold at 1.11?

@ktangsali
Collaborator Author

/blossom-ci

@ktangsali
Collaborator Author

> Warp 1.13.0 is out, and ships support for Rapids memory manager. This would in principle give us pretty good savings in memory for everywhere we use warp for something memory intensive, since we can then share with pytorch.
>
> I'd propose we shift this to 1.13 for a baseline instead of 1.11 unless there is a reason to hold at 1.11?

That is a good point. My only hesitation: 1.13 was launched only a few days ago 😅 I am open either way. 1.11 was chosen because that is the first version where wp.Device becomes available.
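The version-floor discussion above can be made concrete with a small stdlib sketch. Note that `parse_version` and `meets_floor` are illustrative helpers written for this example; they are not part of warp or PhysicsNeMo, and real tooling should prefer `packaging.version`:

```python
def parse_version(version: str) -> tuple[int, ...]:
    # Naive "MAJOR.MINOR.PATCH" parser; packaging.version also
    # handles pre-releases and other PEP 440 forms.
    return tuple(int(part) for part in version.split("."))

def meets_floor(installed: str, floor: str = "1.11.0") -> bool:
    # Tuple comparison is lexicographic, so (1, 12, 1) >= (1, 11, 0).
    return parse_version(installed) >= parse_version(floor)

print(meets_floor("1.12.1", "1.11.0"))  # version resolved by uv.lock -> True
print(meets_floor("1.5.0", "1.11.0"))   # the old floor -> False
```

Tuple comparison is why the naive parser works for plain three-part versions: components are compared left to right, so 1.5.0 fails the 1.11.0 floor even though "1.5.0" > "1.11.0" as strings.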

Collaborator

@coreyjadams coreyjadams left a comment


LGTM

@ktangsali ktangsali force-pushed the update-warp-version branch from 17386b8 to ed9f4e4 on May 6, 2026 at 21:00
@ktangsali
Collaborator Author

/ok to test a1e7404

@ktangsali
Collaborator Author

GitHub CI passed. Blossom CI is down. Merging.

@ktangsali ktangsali merged commit d0aebb0 into NVIDIA:main May 6, 2026
5 checks passed
@ktangsali ktangsali deleted the update-warp-version branch May 6, 2026 22:09
ktangsali added a commit to ktangsali/physicsnemo-cfd that referenced this pull request May 7, 2026
ktangsali added a commit to NVIDIA/physicsnemo-cfd that referenced this pull request May 8, 2026
* test nim inference workflow

* update toml file after this merge: NVIDIA/physicsnemo#1624

* compress images

* clean up debug statements

* remove references to NGC artifacts

* add a github ci to run pre-commit checks

* address review feedback

* black formatting

* add cleaned up notebooks