chore: sync upstream v1.11.7#8
Conversation
## Summary

A supply chain security audit of the dbt-databricks CI/CD pipeline identified **8 findings** across GitHub Actions workflows. This PR addresses the critical and medium severity issues:

- **86% of GitHub Action references** used mutable tags (`@v4`, `@v5`) instead of immutable commit SHAs — an attacker who compromises an upstream action repo can silently change what code runs in CI, including jobs with access to Databricks secrets
- **`pypa/hatch@install`** had zero version pinning (a branch ref, not even a tag) and ran in 6 jobs
- **`conventional-commits-parser`** npm package was installed without any version constraint
- **Dependabot config** was in the wrong directory (`.github/ISSUE_TEMPLATE/dependabot.yml`) and was completely inactive
- **No lock file** — `uv.lock` was in `.gitignore`, making CI builds non-deterministic

## Changes

### 1. Pin all GitHub Actions to immutable commit SHAs

- Replaced all 32 mutable tag references across `main.yml`, `integration.yml`, `ci-pr-linting.yml`, and `coverage.yml` with full 40-character commit SHAs
- All SHAs verified via `gh api` to predate March 18, 2026
- `stale.yml` was already SHA-pinned — no change needed
- Trailing comments preserve the original tag for readability (e.g., `actions/checkout@34e11487... # v4`); a combined sketch of the pinning style appears after section 4 below

**SHA Reference:**

| Action | SHA | Commit Date |
|--------|-----|-------------|
| `actions/checkout` | `34e11487...` | 2025-11-13 |
| `actions/setup-python` | `a26af69b...` | 2025-04-24 |
| `actions/setup-node` | `49933ea5...` | 2025-04-02 |
| `actions/upload-artifact` | `ea165f8d...` | 2025-03-19 |
| `astral-sh/setup-uv` | `38f3f104...` | 2024-11-30 |
| `pypa/hatch` | `257e27e5...` | 2024-05-23 |
| `py-cov-action/python-coverage-comment-action` | `7188638f...` | 2026-01-06 |

### 2. Pin conventional-commits-parser to v6.3.0

- npm versions are immutable (they cannot be republished with different content), so exact version pinning is sufficient
- v6.3.0 was published 2026-03-01 (before the March 18 cutoff)
- Note: Dependabot cannot auto-update this pin since it is an inline workflow install, not a `package.json` dependency — updates must be manual

### 3. Fix Dependabot config location and add github-actions ecosystem

- Moved from `.github/ISSUE_TEMPLATE/dependabot.yml` (wrong — GitHub never reads this path) to `.github/dependabot.yml`
- Added a `github-actions` ecosystem with a weekly schedule so SHA-pinned actions get automatic update PRs
- Kept the `pip` ecosystem with a daily schedule
- No `npm` ecosystem — the npm dependency is an inline workflow install that Dependabot cannot track

### 4. Commit uv.lock and enforce frozen installs in CI

- Removed `uv.lock` from `.gitignore` and committed the generated lock file (82 resolved packages)
- Added the `UV_FROZEN: "1"` environment variable to all 6 CI jobs (3 in `main.yml`, 3 in `integration.yml`)
- When `UV_FROZEN=1` is set, uv refuses to install if the lock file doesn't match `pyproject.toml`, failing CI loudly instead of silently re-resolving
- **Developer impact:** when changing dependencies in `pyproject.toml`, you must also run `uv lock` and commit the updated `uv.lock`
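For illustration, a minimal workflow sketch combining the pinning style from section 1 and the frozen-install env from section 4. The SHAs are placeholders (the real pins use the full 40-character SHAs in the table above), and the job layout is illustrative rather than copied from `main.yml`.

```yaml
jobs:
  unit:
    runs-on: ubuntu-latest
    env:
      UV_FROZEN: "1"  # uv refuses to install if uv.lock does not match pyproject.toml
    steps:
      # Pinned to an immutable commit SHA; the trailing comment records the human-readable tag.
      - uses: actions/checkout@<full-40-char-sha>  # v4
      - uses: astral-sh/setup-uv@<full-40-char-sha>  # pinned the same way
      # Fails loudly instead of silently re-resolving when the lock file is stale.
      - run: uv sync
```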
### 5. Enhance CODEOWNERS

- Added `@jprakash-db` as a code owner
- Added an explicit `/.github/workflows/` rule to make CI security review governance explicit

## Files Modified

| File | Change |
|------|--------|
| `.github/workflows/main.yml` | 14 action refs SHA-pinned, `UV_FROZEN: "1"` added to 3 jobs |
| `.github/workflows/integration.yml` | 15 action refs SHA-pinned, `UV_FROZEN: "1"` added to 3 jobs |
| `.github/workflows/ci-pr-linting.yml` | 2 action refs SHA-pinned, npm pinned to `@6.3.0` |
| `.github/workflows/coverage.yml` | 1 action ref SHA-pinned |
| `.github/ISSUE_TEMPLATE/dependabot.yml` | Deleted |
| `.github/dependabot.yml` | Created (correct location) |
| `.gitignore` | Removed `uv.lock` line |
| `uv.lock` | Created (82 resolved packages) |
| `.github/CODEOWNERS` | Added `@jprakash-db`, added workflow rule |

## Local Validation Results

- **actionlint:** no new errors (only pre-existing warnings about the `linux-ubuntu-latest` custom runner label)
- **grep audit:** 0 unpinned action refs remaining (a sketch of one way to reproduce this check follows this description)
- **SHA resolution:** all 7 SHAs verified via `gh api`
- **All SHAs predate March 18, 2026**
- **npm pin:** `conventional-commits-parser@6.3.0` installs and parses correctly
- **Frozen mode:** env resolution, code-quality, and 719 unit tests all pass with `UV_FROZEN=1`
- **Drift test:** frozen mode correctly fails when the lock file is stale

## Test plan

- [ ] CI workflows pass with SHA-pinned actions
- [ ] `Check PR title format` workflow validates PR titles correctly with the pinned npm package
- [ ] Unit tests pass with `UV_FROZEN=1` and the committed `uv.lock`
- [ ] After merge: verify Dependabot activates (Security > Dependabot tab) for the pip and github-actions ecosystems
- [ ] After merge: verify correct reviewers are auto-requested on workflow file changes

## Audit Findings Addressed

| # | Finding | Severity | Status |
|---|---------|----------|--------|
| 1 | `pypa/hatch@install` — no version constraint | HIGH | Fixed |
| 2 | 19/22 GitHub Actions pinned to tags, not SHAs | MEDIUM | Fixed |
| 3 | `npm install conventional-commits-parser` unpinned | MEDIUM | Fixed |
| 4 | No `uv.lock` committed — non-deterministic CI | MEDIUM | Fixed |
| 5 | Dependabot config in wrong directory | MEDIUM | Fixed |
| 6 | 8/11 dev dependencies unpinned | LOW | Deferred (uv.lock covers this) |
| 7 | Build verification tools unpinned | LOW | Deferred (uv.lock covers this) |
| 8 | `contents: write` on unit test job | INFO | Acceptable — already scoped |

---

JIRA: [PECOBLR-2368](https://databricks.atlassian.net/browse/PECOBLR-2368)

This pull request was AI-assisted by Isaac.

[PECOBLR-2368]: https://databricks.atlassian.net/browse/PECOBLR-2368?atlOrigin=eyJpIjoiNWRkNTljNzYxNjVmNDY3MDlhMDU5Y2ZhYzA5YTRkZjUiLCJwIjoiZ2l0aHViLWNvbS1KU1cifQ
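For reference, one way to reproduce the grep audit mentioned under Local Validation above; this is a hedged sketch, not necessarily the exact command that was run.

```bash
# List every `uses:` reference in the workflows, then filter out anything already
# pinned to a full 40-character commit SHA; whatever remains is unpinned.
grep -rnE 'uses: *[^ ]+@' .github/workflows/ \
  | grep -vE '@[0-9a-f]{40}' \
  || echo "0 unpinned action refs remaining"
```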
…bricks#1375)

## Summary

Replace the `conventional-commits-parser` npm package in `ci-pr-linting.yml` with a pure bash regex, eliminating the entire Node.js/npm supply chain from the CI pipeline.

## Why

The workflow installed `conventional-commits-parser@6.3.0` via `npm install --global` at CI time. While the direct package was version-pinned, its **transitive npm dependencies** use semver ranges and are resolved fresh on every CI run. This means a compromised transitive dependency (e.g., via a supply chain worm like CanisterWorm) could be silently pulled into CI with no lock file to prevent it.

The parser was used only to check whether a `type` field exists in the PR title — no other parsed fields (scope, subject, body, footer) were used. This makes the entire Node.js toolchain (actions/checkout, actions/setup-node, npm install, conventional-commits-parser, jq) replaceable with a single bash regex.

## What changed

**File:** `.github/workflows/ci-pr-linting.yml`

**Removed (3 steps + 1 action):**

- `actions/checkout` — not needed since no repo code is referenced
- `actions/setup-node` — Node.js is no longer needed
- `npm install --global conventional-commits-parser@6.3.0` — the dependency being eliminated

**Modified (1 step):**

- "Validate PR title" — replaced the npm parser + jq pipeline with a bash `[[ =~ ]]` regex match (a sketch follows this description)

**Kept unchanged (2 steps):**

- "Add comment to warn user" — sticky comment on failure (unchanged)
- "Delete a previous comment when the issue has been resolved" — cleanup on success (unchanged)

## Regex equivalence

Original parser regex: `^(\w*)!?(?:\(([\w\$\.\-\* ]*)\))?\: (.*)$`

Replacement bash regex: `^[a-zA-Z]+!?(\([^)]*\))?\: .+`

| Feature | Original | Replacement | Difference |
|---------|----------|-------------|------------|
| Type | `(\w*)` zero or more | `[a-zA-Z]+` one or more letters | Stricter — rejects empty or numeric-only types |
| Breaking change `!` | `!?` | `!?` | Identical |
| Optional scope | `(?:\(([\w\$\.\-\* ]*)\))?` | `(\([^)]*\))?` | Slightly more permissive on scope chars, but scope content was never validated |
| Separator | `\: ` | `\: ` | Identical |
| Description | `(.*)$` zero or more | `.+` one or more | Stricter — requires at least one char after `: ` |

Net effect: the replacement is slightly **stricter** in two beneficial ways.

## Supply chain impact

| Before | After |
|--------|-------|
| 4 GitHub Actions used | 2 GitHub Actions used |
| Node.js 20 runtime required | No additional runtime |
| npm install pulls transitive deps fresh each run | No npm dependencies |
| jq required to parse JSON output | No JSON parsing |
| postinstall scripts execute at install | No install step |

## Validation

- 17/17 regex test cases pass (9 valid titles, 8 invalid titles)
- actionlint: no new errors
- 0 unpinned action refs
- Sticky comment behavior unchanged (uses `if: failure()` / `if: success()` on job status)

## Test plan

- [ ] PR title linting workflow passes on this PR itself
- [ ] Open a test PR with an invalid title — verify the sticky comment appears
- [ ] Fix the title — verify the sticky comment is deleted

---

JIRA: [PECOBLR-2368](https://databricks.atlassian.net/browse/PECOBLR-2368)

This pull request was AI-assisted by Isaac.

[PECOBLR-2368]: https://databricks.atlassian.net/browse/PECOBLR-2368?atlOrigin=eyJpIjoiNWRkNTljNzYxNjVmNDY3MDlhMDU5Y2ZhYzA5YTRkZjUiLCJwIjoiZ2l0aHViLWNvbS1KU1cifQ
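For illustration, a hedged sketch of what the bash validation step looks like. The variable wiring is illustrative (the real step reads the title from the pull request event context), but the regex is the one documented above.

```bash
# PR_TITLE would come from the workflow event context, e.g. ${{ github.event.pull_request.title }}.
PR_TITLE="feat(adapter): add notebook-scoped packages"

# Conventional Commits shape: type, optional "!", optional "(scope)", then ": " and a description.
regex='^[a-zA-Z]+!?(\([^)]*\))?: .+'

if [[ "$PR_TITLE" =~ $regex ]]; then
  echo "PR title follows Conventional Commits"
else
  echo "Invalid PR title: ${PR_TITLE}" >&2
  exit 1
fi
```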
…bricks#1384) Switch all workflows to databricks-protected-runner-group with linux-ubuntu-latest-hardened labels. Add JFrog OIDC authentication via a reusable composite action and configure uv to use JFrog as PyPI proxy for workflows that install Python packages.
### Description

Fork PRs cannot authenticate to JFrog (no OIDC token available). This adds a cache-based dependency strategy so fork PRs get full CI feedback.

- Add `warmDepsCache.yml`: a trusted workflow that downloads all deps via JFrog and saves them to the GitHub Actions cache (triggers on push to main, a daily schedule, and manual dispatch with an optional PR number)
- Add a `setup-python-deps` composite action: restores cached deps and enables offline mode (`UV_OFFLINE` + `PIP_NO_INDEX`); see the sketch after this description
- Update `main.yml` to use `setup-python-deps` instead of `setup-jfrog-pypi`, remove the `id-token: write` permission (no longer needed)

### Checklist

- [x] I have run this code in development and it appears to resolve the stated issue
- [ ] This PR includes tests, or tests are not required/relevant for this PR - NA
- [ ] I have updated the `CHANGELOG.md` and added information about my change to the "dbt-databricks next" section. - NA
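A rough sketch of the offline-mode idea behind `setup-python-deps`; step names, cache keys, and paths here are assumptions, not the action's actual contents.

```yaml
runs:
  using: composite
  steps:
    - name: Restore warmed dependency cache
      uses: actions/cache/restore@v4        # the real workflows pin actions by commit SHA
      with:
        path: ~/.cache/uv
        key: python-deps-${{ runner.os }}-${{ hashFiles('uv.lock') }}
    - name: Force offline resolution
      shell: bash
      run: |
        # With the cache restored, refuse to reach any package index at install time.
        echo "UV_OFFLINE=1" >> "$GITHUB_ENV"
        echo "PIP_NO_INDEX=1" >> "$GITHUB_ENV"
```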
## Summary

- Fix `pre-commit: command not found` in warmDepsCache — pre-commit lives inside hatch's default env, so use `hatch run pre-commit install-hooks`
- Fix the UV cache path mismatch — `setup-uv` overrides `UV_CACHE_DIR` to a temp path, but cache save/restore uses `~/.cache/uv`. Pin `UV_CACHE_DIR: /home/runner/.cache/uv` in the job env so `setup-uv` respects it and the paths align (see the sketch after this description)

## Test plan

> Note: I had made a temporary change to trigger `warmDepsCache` on this PR for testing

- [x] Verify the warmDepsCache run succeeds ([Ref](https://github.com/databricks/dbt-databricks/actions/runs/24276657026/job/70891669261?pr=1389))
- [x] Verify that the correct artifacts are uploaded - confirmed using the GitHub API
- [x] All existing tests pass
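A minimal sketch of the cache-path alignment described above, assuming the standard hosted-runner home directory; the surrounding job definition is illustrative.

```yaml
jobs:
  warm-cache:
    runs-on: ubuntu-latest
    env:
      # Pin the uv cache dir so setup-uv and the cache save/restore steps agree on the path.
      UV_CACHE_DIR: /home/runner/.cache/uv
    steps:
      # pre-commit lives inside hatch's default environment, so invoke it through hatch.
      - run: hatch run pre-commit install-hooks
```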
## Summary

Fixes issues preventing the dependency cache (databricks#1386) from working on fork PRs:

- **uv cache path**: `setup-uv` overrode the cache dir to a temp path. Use `cache-local-path` to pin it.
- **uv offline lookups**: the cache is scoped by index URL hash. Set `UV_INDEX_URL` to match the warmer's JFrog URL.
- **Pre-commit broken symlinks**: the warmer's Python path differed from the consumer's. Use `setup-python` in the warmer.
- **pip offline**: `PIP_NO_INDEX` blocks all index access. Create a pip wheelhouse and use `PIP_FIND_LINKS` (see the sketch after this description).
- **Version-specific wheels**: create test envs for all matrix versions in the warmer.
- Misc: removed the no-op `strategy.fail-fast`, added the missing `id: coverage_comment`.

## Test plan

- [x] `warm-cache` job succeeds on this PR
- [x] All `main.yml` jobs pass offline
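A hedged sketch of the pip wheelhouse idea from the fourth bullet; the requirements source and wheelhouse path are illustrative.

```bash
# In the trusted warm-cache job: download wheels for everything pip will need later.
pip download --dest ~/.cache/pip-wheelhouse -r requirements.txt

# In the fork-PR consumer job: block index access, but let pip resolve from the wheelhouse.
export PIP_NO_INDEX=1
export PIP_FIND_LINKS=~/.cache/pip-wheelhouse
pip install -r requirements.txt  # resolves only from the local wheelhouse
```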
## Summary

- Sets `open-pull-requests-limit: 0` on both the pip and github-actions ecosystems, which disables routine version-bump PRs while still allowing security update PRs (they bypass this limit); a sketch of the resulting config follows this description
- Changes the pip scanning interval from daily to weekly, since it now only matters for security scanning cadence

## Context

Closed 9 open Dependabot PRs that were all routine version bumps with no security motivation. This config change prevents future noise.
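For reference, a sketch of what the resulting `.github/dependabot.yml` plausibly looks like after this change, reconstructed from the description rather than copied from the repo.

```yaml
version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "weekly"        # was daily; only security scanning cadence matters now
    open-pull-requests-limit: 0 # no routine bump PRs; security updates bypass this limit
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 0
```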
…atabricks#1355)

Resolves databricks#1354

### Description

Fixes a timing bug where `table_format='iceberg'` (and other capability-gated features) fail with "iceberg requires DBR 14.3+" when the model targets a named compute via the `databricks_compute` model config, even though that compute supports the required DBR version.

### Fix

One-line change: call `_cache_dbr_capabilities()` eagerly in `_create_fresh_connection()` before reading the cache. This ensures the DBR version is queried and cached at connection creation time rather than waiting for the lazy open(). The call is idempotent (guarded by an `if http_path not in cache` check), so for default compute with a warm cache it is a no-op.

### Key changes

- `dbt/adapters/databricks/connections.py`: Added `self._cache_dbr_capabilities(creds, conn.http_path)` before `conn.capabilities = self._get_capabilities_for_http_path(conn.http_path)` in `_create_fresh_connection()` (see the sketch after this description)
- `tests/unit/test_connection_manager.py`: Added 3 unit tests validating eager caching for named compute, fallback behavior when the cache is empty, and idempotency for default compute with a warm cache

### Checklist

- [x] I have run this code in development and it appears to resolve the stated issue
- [x] This PR includes tests, or tests are not required/relevant for this PR
- [ ] I have updated the `CHANGELOG.md` and added information about my change to the "dbt-databricks next" section.

---------

Signed-off-by: trouze <tyler@tylerrouze.com>
Co-authored-by: Shubham Dhal <shubham.dhal@databricks.com>
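A hedged sketch of the change in context; the method signature and surrounding lines are paraphrased from the description above, not copied from `connections.py`.

```python
def _create_fresh_connection(self, creds, http_path):
    conn = ...  # open the underlying connection (details elided in this sketch)

    # Eagerly query and cache the DBR version for this http_path so capability checks
    # (e.g. table_format='iceberg') see the named compute's real version. The call is
    # idempotent: it no-ops when http_path is already in the cache.
    self._cache_dbr_capabilities(creds, conn.http_path)
    conn.capabilities = self._get_capabilities_for_http_path(conn.http_path)
    return conn
```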
…b run (databricks#1321)

Resolves databricks#1320

### Description

Add a feature to install packages in a notebook-scoped environment for the `PythonCommandSubmitter` and `PythonNotebookUploader` classes (an illustrative model config sketch follows this description).

#### Execution Test

I made the changes and tested them in our environment, verifying that the compiled code now has the prepended package installation.

##### All-purpose cluster

<img width="1240" height="740" alt="image" src="https://github.com/user-attachments/assets/4c75aae2-6740-49b5-9e6e-d7676a6aac02" />

##### Serverless cluster

<img width="1228" height="716" alt="image" src="https://github.com/user-attachments/assets/1d91cae9-6cdf-4836-a86f-18d6156c5f99" />

##### Job cluster

<img width="1241" height="353" alt="image" src="https://github.com/user-attachments/assets/546480f2-ee4c-47de-87af-9cb6adf9f851" />

### Checklist

- [x] I have run this code in development and it appears to resolve the stated issue
- [x] This PR includes tests, or tests are not required/relevant for this PR
- [x] I have updated the `CHANGELOG.md` and added information about my change to the "dbt-databricks next" section.

---------

Signed-off-by: Federico Manuel Gomez Peter <federico.gomez@payclip.com>
Co-authored-by: tejassp-db <241722411+tejassp-db@users.noreply.github.com>
Co-authored-by: Shubham Dhal <shubham.dhal@databricks.com>
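A hedged illustration of how a Python model might opt into notebook-scoped package installation; the exact config keys are an assumption based on the `notebook_scoped_libraries` option named in the file summary further down, so consult `docs/workflow-job-submission.md` for the authoritative syntax.

```python
def model(dbt, session):
    dbt.config(
        materialized="table",
        packages=["numpy==1.26.4"],       # assumption: emitted as a %pip install prepended to the notebook
        notebook_scoped_libraries=True,   # assumption: opt-in flag for notebook-scoped installs
    )
    import numpy as np  # available because the install runs before the model body
    return session.createDataFrame([(float(x),) for x in np.linspace(0, 1, 5)], ["value"])
```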
databricks#1338) (databricks#1364)

Resolves databricks#1360

### Description

`WorkflowJobApi.create()` passes plain dicts to `workspace_client.jobs.create(**converted_job_spec)`. The Databricks SDK's `jobs.create()` iterates over tasks calling `v.as_dict()` on each (confirmed via `JobsAPI.create` source inspection). Plain dicts have no `.as_dict()`, which caused the `'dict' object has no attribute 'as_dict'` error.

1. The fix uses `JobSettings.from_dict()` + `.as_shallow_dict()` to convert the plain dict job spec into proper SDK dataclasses before calling `jobs.create()` (see the sketch after this description).
2. `as_shallow_dict()` is the correct choice here: it unpacks top-level fields as kwargs (matching the `jobs.create()` signature) while keeping nested values as typed `Task` / `NotebookTask` / etc. dataclasses that satisfy `.as_dict()`.
3. This is the same pattern already used by `update_job_settings()` in the same class.

**Changes:**

- `dbt/adapters/databricks/api_client.py` - `WorkflowJobApi.create()` now deserializes the dict via `JobSettings.from_dict()` and passes `.as_shallow_dict()` to the SDK. An inline comment documents the SDK internals that make this the correct choice.
- `tests/unit/api_client/test_workflow_job_api.py` - Tests strengthened to assert `isinstance(task, Task)`, call `.as_dict()` on each task and verify the output (directly proving the original `AttributeError` cannot recur), and a new `test_create__invalid_job_spec_raises` added to cover the `JobSettings.from_dict()` error path.

### Checklist

- [x] I have run this code in development and it appears to resolve the stated issue
- [x] This PR includes tests, or tests are not required/relevant for this PR
- [x] I have updated the `CHANGELOG.md` and added information about my change to the "dbt-databricks next" section.

---------

Signed-off-by: aarushisingh04 <aarushi07.singh@gmail.com>
Co-authored-by: tejassp-db <241722411+tejassp-db@users.noreply.github.com>
Co-authored-by: Shubham Dhal <shubham.dhal@databricks.com>
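A minimal sketch of the conversion described above, assuming a `workspace_client` and a plain-dict `converted_job_spec` are already in hand; this paraphrases the fix rather than reproducing `api_client.py`.

```python
from databricks.sdk.service.jobs import JobSettings

# Rehydrate the plain dict into typed SDK dataclasses (Task, NotebookTask, ...),
# then unpack only the top level so it matches the jobs.create() signature.
job_settings = JobSettings.from_dict(converted_job_spec)
response = workspace_client.jobs.create(**job_settings.as_shallow_dict())
```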
…ll profiles (databricks#1401)

## Summary

- Fix the `TestWorkflowJob` functional test that was **never actually running** due to three compounding bugs:
  1. `skip_profile` listed all 3 available profiles (`databricks_cluster`, `databricks_uc_sql_endpoint`, `databricks_uc_cluster`) — the test was skipped everywhere
  2. It used `simple_python_model`, which hardcodes `submission_method='serverless_cluster'` in `dbt.config()`, overriding the YAML's `workflow_job` setting — the test was exercising the wrong code path
  3. `workflow_schema` included `max_retries`, which is not a valid `jobs.create()` parameter

Resolves follow-up to databricks#1360

### Changes

- Add a `workflow_python_model` fixture without `submission_method` in `dbt.config()` so the YAML schema's `submission_method: workflow_job` takes effect (see the sketch after this description)
- Remove `databricks_uc_cluster` from the skip list so the test runs on at least one profile
- Remove the invalid `max_retries` from `workflow_schema`

### Checklist

- [x] I have run this code in development and it appears to resolve the stated issue
- [x] This PR includes tests, or tests are not required/relevant for this PR
- [x] I have updated the `CHANGELOG.md` and added information about my change to the "dbt-databricks next" section.

## Test plan

- [x] Verified the test was SKIPPED on all 3 profiles before the fix
- [x] Verified the test PASSES on the `databricks_uc_cluster` profile after the fix (ran against a live cluster)
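A hedged sketch of the fixture idea: the model body is illustrative, and the point is simply that `dbt.config()` omits `submission_method` so the schema YAML's `submission_method: workflow_job` is what actually takes effect.

```python
# Fixtures in this file are Python model sources held as string constants.
workflow_python_model = """
def model(dbt, session):
    dbt.config(materialized="table")  # no submission_method here; the schema YAML decides
    return session.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])
"""
```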
…atabricks#1397)

## Summary

- Sets `require_alias: bool = False` on `DatabricksRelation`, preventing `render_limited()` from injecting a `_dbt_limit_subq_*` alias that conflicts with user-provided `AS` aliases when running `dbt run --empty` (a sketch of the change follows this description)
- Adds conformance tests for `--empty` mode (`BaseTestEmpty` and `BaseTestEmptyInlineSourceRef`)
- Adds a unit test verifying `render_limited()` output has no trailing subquery alias

## Context

When `dbt run --empty` is used, dbt wraps `ref()` and `source()` in a zero-row subquery. `BaseRelation` defaults to `require_alias=True`, which appends an auto-generated alias (`_dbt_limit_subq_{table}`). If the user also has an `AS` alias in their SQL (e.g. `{{ ref('orders') }} AS orders`), the two aliases conflict and produce invalid SQL. Databricks SQL does not require subquery aliases, so `require_alias` should be `False`.

Resolves dbt-labs/dbt-adapters#660 for Databricks.

## Test plan

- [x] Unit test: `test_render_limited_with_empty_no_alias` — asserts `render_limited()` produces no trailing alias with `limit=0`
- [x] Functional test: `TestDatabricksEmptyInlineSourceRef` — end-to-end against a live cluster, `dbt run --empty` with an inline source alias
- [x] Functional test: `TestDatabricksEmpty` — general `--empty` conformance
- [x] All 721 existing unit tests pass (0 regressions)
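A hedged sketch of the dataclass-level change; `DatabricksRelation` carries many other fields and policies that are elided here.

```python
from dataclasses import dataclass

from dbt.adapters.base.relation import BaseRelation


@dataclass(frozen=True)
class DatabricksRelation(BaseRelation):
    # Databricks SQL does not require subquery aliases, so render_limited() should not
    # inject a _dbt_limit_subq_* alias that can collide with a user-supplied AS alias.
    require_alias: bool = False
```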
## Summary

- Bump the `dbt-core` upper bound from `<1.11.7` to `<1.11.9` to include 1.11.7 and 1.11.8

All changes reviewed — no breaking changes to adapter interfaces or dbt-databricks behavior.

## Test plan

- [x] CI passes
….6 (databricks#1363)

## Summary

- Bumps the `databricks-sql-connector` upper bound from `<4.1.4` to `<4.1.6`, allowing users to install connector `4.1.5`
- Connector `4.1.5` introduced [`_respect_server_retry_after_header`](databricks/databricks-sql-python#756), which users can now opt into via `connection_parameters` in `profiles.yml` (see the sketch after this description)

## Context

Customers using `use_materialization_v2: true` experience duplicate rows when the server returns HTTP 503 after already committing an INSERT. The connector blindly retries, causing data to be written twice. With `_respect_server_retry_after_header: true`, retries only occur when the server explicitly sends a `Retry-After` header, preventing duplicate writes from infrastructure-level 503s.

## Test plan

- [x] All 744 unit tests pass (`hatch run unit`)
- [x] E2E verification with a dummy dbt project against a UC SQL endpoint:
  - `dbt debug` + `dbt run` succeed with the default profile (connector default `False` applied)
  - `dbt debug` + `dbt run` succeed with `_respect_server_retry_after_header: true` explicitly set

---------

Co-authored-by: Shubham Dhal <shubham.dhal@databricks.com>
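For illustration, a sketch of opting in via `profiles.yml`; the profile and target names are placeholders and most connection fields are elided.

```yaml
my_dbt_project:
  target: dev
  outputs:
    dev:
      type: databricks
      # host, http_path, auth, schema, etc. elided
      connection_parameters:
        # Only retry when the server explicitly sends a Retry-After header,
        # avoiding duplicate INSERTs after an already-committed request gets a 503.
        _respect_server_retry_after_header: true
```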
## Summary

- Bump version `1.11.6` → `1.11.7`
- Nit: add the missing changelog entry for PR databricks#1355 (capability detection fix for named compute)
Merge upstream databricks/dbt-databricks v1.11.7 into our fork. Conflict resolution: replaced Databricks org protected runner group with ubuntu-latest in ci-pr-linting.yml, coverage.yml, and main.yml while keeping all other upstream changes (JFrog PyPI proxy, dependency caching, hardened GHA supply chain, UV_FROZEN env). Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Our fork cannot authenticate to databricks.jfrog.io or use the Databricks protected runner group. Remove setup-jfrog-pypi and setup-python-deps steps from CI workflows, use standard PyPI instead. Also replace protected runner group with ubuntu-latest in integration.yml. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Coverage report (click to see where and how coverage changed)
This report was generated by python-coverage-comment-action
Pull request overview
Syncs upstream databricks/dbt-databricks tag v1.11.7 into this fork, bringing in upstream fixes/features (notebook-scoped Python packages, capability detection, workflow job spec conversion) plus CI hardening/caching changes and updated dependency bounds.
Changes:
- Add notebook-scoped Python package installation support for Python model submissions and document/test it.
- Fix/adjust DBR capability caching behavior and workflow job creation to convert dict specs into SDK dataclasses.
- Update CI/workflows (pinned actions, uv/Hatch caching helpers, permissions hardening) and bump dependency upper bounds/version metadata.
Reviewed changes
Copilot reviewed 30 out of 33 changed files in this pull request and generated 8 comments.
Show a summary per file
| File | Description |
|---|---|
| `dbt/adapters/databricks/python_models/python_submissions.py` | Adds notebook-scoped package install code-paths and library config changes for Python submissions |
| `dbt/adapters/databricks/python_models/python_config.py` | Introduces `notebook_scoped_libraries` config and a derived `PythonPackagesConfig` helper |
| `dbt/adapters/databricks/connections.py` | Adds `_try_cache_dbr_capabilities` and calls it during fresh connection creation |
| `dbt/adapters/databricks/api_client.py` | Converts workflow job specs via `JobSettings.from_dict(...).as_shallow_dict()` before `jobs.create()` |
| `dbt/adapters/databricks/relation.py` | Adds `require_alias` flag to relation dataclass (used by upstream empty-mode alias behavior) |
| `tests/unit/python/test_python_submitters.py` | Expands unit coverage for notebook-scoped package injection across submitters |
| `tests/unit/python/test_python_job_support.py` | Adds coverage for excluding packages from cluster-level libraries when notebook-scoped |
| `tests/unit/python/test_python_config.py` | Adds config parsing test for Python packages config |
| `tests/unit/test_connection_manager.py` | Adds unit tests to ensure DBR capability cache isn't poisoned by None versions |
| `tests/unit/test_relation.py` | Adds regression test for `render_limited()` not adding an alias in empty/limit=0 mode |
| `tests/unit/api_client/test_workflow_job_api.py` | Updates tests to assert tasks are SDK `Task` objects and adds invalid-spec error coverage |
| `tests/functional/adapter/python_model/test_python_model.py` | Fixes workflow job functional test setup and adds functional tests for notebook-scoped packages |
| `tests/functional/adapter/python_model/fixtures.py` | Adds/updates Python model fixtures for workflow jobs and notebook-scoped packages scenarios |
| `tests/functional/adapter/empty/test_empty.py` | Adds functional tests for `dbt run --empty` behavior (including inline ref/source) |
| `tests/functional/adapter/empty/__init__.py` | Marks new functional test package |
| `docs/workflow-job-submission.md` | Documents Python package configuration and notebook-scoped behavior |
| `pyproject.toml` | Bumps databricks-sql-connector and dbt-core upper bounds |
| `dbt/adapters/databricks/__version__.py` | Bumps adapter version to 1.11.7 |
| `CHANGELOG.md` | Updates 1.11.7 release notes |
| `.github/workflows/main.yml` | Pins actions, uses uv/Hatch caching, sets UV_FROZEN, hardens permissions |
| `.github/workflows/integration.yml` | Pins actions and sets UV_FROZEN for integration workflows |
| `.github/workflows/coverage.yml` | Pins coverage-comment action by SHA |
| `.github/workflows/ci-pr-linting.yml` | Switches PR title linting to a bash regex approach |
| `.github/workflows/stale.yml` | Moves stale workflow to protected runner group |
| `.github/workflows/warmDepsCache.yml` | Adds dependency cache warming workflow (protected runner + uv/Hatch) |
| `.github/actions/setup-python-deps/action.yml` | Adds composite action to restore caches and enable offline mode |
| `.github/actions/setup-jfrog-pypi/action.yml` | Adds composite action to configure JFrog PyPI proxy via OIDC |
| `.github/dependabot.yml` | Adds dependabot config (security updates only) |
| `.github/ISSUE_TEMPLATE/dependabot.yml` | Removes old dependabot config from issue templates |
| `.github/CODEOWNERS` | Adds an additional codeowner + explicit workflow rule |
| `.github/last-upstream-sync-tag` | Updates last synced tag to v1.11.7 |
| `.gitignore` | Stops ignoring `uv.lock` (so the lockfile can be committed) |
iamfj
left a comment
LGTM but not sure about the CI stuff, if that works in our fork
Most of the CI stuff is still databricks related. For now, only those entries required for this repository are adjusted.
Summary
- Merge upstream `databricks/dbt-databricks` tag `v1.11.7` into our fork
- Keep `ubuntu-latest` runners; take all other upstream CI improvements (JFrog PyPI proxy, dependency caching, hardened GHA supply chain, `UV_FROZEN` env)
- Update `.github/last-upstream-sync-tag` to `v1.11.7`
Upstream Changes (v1.11.6 → v1.11.7)
- fix: capability detection for named compute (`databricks_compute`) (databricks/dbt-databricks#1355)
Conflict Resolution
- `.github/workflows/ci-pr-linting.yml`: `ubuntu-latest` runner
- `.github/workflows/coverage.yml`: `ubuntu-latest` runner
- `.github/workflows/main.yml`: `ubuntu-latest` runner
Tags Created
- `v1.11.6-enhanced` — points to pre-sync main (session mode on v1.11.6 base)
- `v1.11.7-enhanced` — points to this PR's HEAD (session mode on v1.11.7 base)
Test Plan
- `upstream-sync.yml` workflow still present
- Fork-specific changes preserved (`session.py`, `credentials.py`, `connections.py`)
- All workflows run on `ubuntu-latest` runners
🤖 Generated with Claude Code