Commit 7628c6e (parent: 106691e)

docs(migration): update MIGRATION.md with clarified import mappings, format protocol, and v2 API changes

Signed-off-by: Yurii Serhiichuk <savik.ne@gmail.com>

1 file changed: MIGRATION.md (87 additions, 69 deletions)

@@ -1,47 +1,61 @@
# Migrating from CloudEvents SDK v1 to v2

This guide covers the breaking changes and new patterns introduced in v2 of the
CloudEvents Python SDK.

## Requirements

|              | v1                                 | v2                            |
|--------------|------------------------------------|-------------------------------|
| Python       | 3.7+                               | **3.10+**                     |
| Dependencies | varies (optional `pydantic` extra) | `python-dateutil>=2.8.2` only |

## Intermediate Step: `cloudevents.v1` Compatibility Layer

If you are not ready to migrate to the v2 core API, the `cloudevents.v1` package
provides a drop-in compatibility layer that preserves the v1 API under a new
namespace. This lets you unpin from the old top-level imports without rewriting
your event-handling logic.

Swap the old top-level imports for their `cloudevents.v1.*` equivalents:

| Old import                                         | Compat layer import                              |
|----------------------------------------------------|--------------------------------------------------|
| `from cloudevents.http import CloudEvent`          | `from cloudevents.v1.http import CloudEvent`     |
| `from cloudevents.http import from_http`           | `from cloudevents.v1.http import from_http`      |
| `from cloudevents.http import from_json`           | `from cloudevents.v1.http import from_json`      |
| `from cloudevents.http import from_dict`           | `from cloudevents.v1.http import from_dict`      |
| `from cloudevents.conversion import to_binary`     | `from cloudevents.v1.http import to_binary`      |
| `from cloudevents.conversion import to_structured` | `from cloudevents.v1.http import to_structured`  |
| `from cloudevents.conversion import to_json`       | `from cloudevents.v1.http import to_json`        |
| `from cloudevents.conversion import to_dict`       | `from cloudevents.v1.conversion import to_dict`  |
| `from cloudevents.kafka import KafkaMessage`       | `from cloudevents.v1.kafka import KafkaMessage`  |
| `from cloudevents.kafka import to_binary`          | `from cloudevents.v1.kafka import to_binary`     |
| `from cloudevents.kafka import from_binary`        | `from cloudevents.v1.kafka import from_binary`   |
| `from cloudevents.pydantic import CloudEvent`      | `from cloudevents.v1.pydantic import CloudEvent` |

The compat layer behaviour is identical to the old v1 SDK: events are dict-like
and mutable, marshallers/unmarshallers are accepted as callables, and
`is_binary`/`is_structured` helpers are still available. The compat layer does
**not** enforce strict mypy and is not subject to the v2 validation rules.

When you are ready to move fully to v2, follow the rest of this guide.

## Architectural Changes

v2 is a ground-up rewrite with four fundamental shifts:

1. **Protocol-based design** -- `BaseCloudEvent` is a `Protocol`, not a base class.
   Events expose explicit getter methods instead of dict-like access.
2. **Explicit serialization** -- Implicit JSON handling with marshaller callbacks is
   replaced by a `Format` protocol. `JSONFormat` is the built-in implementation; you
   can write your own.
3. **Same auto-generated attributes** -- Like v1, v2 auto-generates `id` (UUID4),
   `time` (UTC now), and `specversion` (`"1.0"` or `"0.3"`) if omitted. Only `type`
   and `source` are strictly required.
4. **Strict validation** -- Events are validated at construction time. Extension
   attribute names must be 1-20 lowercase alphanumeric characters. `time` must be a
   timezone-aware `datetime`.

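The first shift above can be sketched with plain `typing.Protocol` machinery. The names `EventLike` and `MyEvent` below are hypothetical stand-ins, not the SDK's real classes; the point is only that conformance is structural, so no inheritance from a base class is needed:

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class EventLike(Protocol):
    """Stand-in for a BaseCloudEvent-style Protocol with explicit getters."""

    def get_id(self) -> str: ...
    def get_source(self) -> str: ...
    def get_type(self) -> str: ...


class MyEvent:
    """Satisfies EventLike structurally -- no inheritance required."""

    def __init__(self, id: str, source: str, type: str) -> None:
        self._id, self._source, self._type = id, source, type

    def get_id(self) -> str:
        return self._id

    def get_source(self) -> str:
        return self._source

    def get_type(self) -> str:
        return self._type


event = MyEvent("123", "/myapp", "com.example.created")
print(isinstance(event, EventLike))  # True: the structural check passes
```

Because the check is structural, any object exposing the getters works with code typed against the protocol.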
## Creating Events

@@ -71,7 +85,8 @@ event = CloudEvent(

## Accessing Event Attributes

v1 events were dict-like. v2 events use explicit getter methods and are immutable
after construction.

**v1:**

@@ -272,30 +287,31 @@ event = from_http(headers, body, data_unmarshaller=yaml.safe_load)
```python
from cloudevents.core.formats.base import Format
from cloudevents.core.base import BaseCloudEvent, EventFactory


class YAMLFormat:
    """Example custom format -- implement the Format protocol."""

    def read(
        self,
        event_factory: EventFactory | None,
        data: str | bytes,
    ) -> BaseCloudEvent:
        ...  # Parse YAML into attributes dict, call event_factory(attributes, data)

    def write(self, event: BaseCloudEvent) -> bytes:
        ...  # Serialize entire event to YAML bytes

    def write_data(
        self,
        data: dict | str | bytes | None,
        datacontenttype: str | None,
    ) -> bytes:
        ...  # Serialize just the data payload

    def read_data(
        self,
        body: bytes,
        datacontenttype: str | None,
    ) -> dict | str | bytes | None:
        ...  # Deserialize just the data payload
```
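As a self-contained illustration of the payload half of that protocol, here is a minimal sketch using only the standard library. `ToyJSONFormat` is a hypothetical stand-in (no SDK imports) that only handles the data payload:

```python
import json


class ToyJSONFormat:
    """Hypothetical minimal format: round-trips only the data payload."""

    def write_data(self, data, datacontenttype):
        # Sketch assumes a JSON-compatible payload; a real format would
        # branch on datacontenttype for non-JSON content.
        return json.dumps(data).encode("utf-8")

    def read_data(self, body, datacontenttype):
        return json.loads(body.decode("utf-8"))


fmt = ToyJSONFormat()
payload = {"message": "hello"}
body = fmt.write_data(payload, "application/json")
print(fmt.read_data(body, "application/json"))  # {'message': 'hello'}
```

A real implementation would also fill in `read`/`write` so whole events, not just payloads, can be serialized.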

@@ -331,12 +347,12 @@ from cloudevents.exceptions import (

```python
from cloudevents.core.exceptions import (
    BaseCloudEventException,        # Base for all CloudEvent errors
    CloudEventValidationError,      # Aggregated validation errors (raised on construction)
    MissingRequiredAttributeError,  # Missing required attribute (also a ValueError)
    InvalidAttributeTypeError,      # Wrong attribute type (also a TypeError)
    InvalidAttributeValueError,     # Invalid attribute value (also a ValueError)
    CustomExtensionAttributeError,  # Invalid extension name (also a ValueError)
)
```
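The "(also a ValueError)" notes describe a dual-inheritance pattern. A minimal sketch of that pattern, with stand-in classes rather than the real SDK hierarchy, shows why v1-era `except ValueError` handlers keep working:

```python
class BaseCloudEventException(Exception):
    """Stand-in for the SDK's base exception."""


class MissingRequiredAttributeError(BaseCloudEventException, ValueError):
    """Derives from both the SDK base and a builtin exception type."""

    def __init__(self, attr: str) -> None:
        super().__init__(f"Missing required attribute: {attr}")


try:
    raise MissingRequiredAttributeError("source")
except ValueError as e:  # caught via the builtin base class
    print(type(e).__name__)  # MissingRequiredAttributeError
```

Existing handlers written against `ValueError`/`TypeError` therefore continue to catch the new, more specific exceptions.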

@@ -353,35 +369,35 @@ except CloudEventValidationError as e:

## Removed Features

| Feature                           | v1                                            | v2 Alternative                                    |
|-----------------------------------|-----------------------------------------------|---------------------------------------------------|
| Pydantic integration              | `from cloudevents.pydantic import CloudEvent` | Removed -- use the core `CloudEvent` directly     |
| Dict-like event access            | `event["source"]`, `event["x"] = y`           | `event.get_source()`, `event.get_extension("x")`  |
| `from_dict()`                     | `from cloudevents.http import from_dict`      | Construct `CloudEvent(attributes=d)` directly     |
| `to_dict()`                       | `from cloudevents.conversion import to_dict`  | `event.get_attributes()` + `event.get_data()`     |
| `from_json()`                     | `from cloudevents.http import from_json`      | `JSONFormat().read(None, json_bytes)`             |
| `to_json()`                       | `from cloudevents.conversion import to_json`  | `JSONFormat().write(event)`                       |
| Custom marshallers                | `data_marshaller=fn` / `data_unmarshaller=fn` | Implement the `Format` protocol                   |
| `is_binary()` / `is_structured()` | `from cloudevents.http import is_binary`      | Mode is handled internally by `from_http_event()` |
| Deprecated helpers                | `to_binary_http()`, `to_structured_http()`    | `to_binary_event()`, `to_structured_event()`      |
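For example, the `to_dict()` row above can be approximated with a small helper. `StubEvent` is a hypothetical stand-in exposing the same getters the table references:

```python
class StubEvent:
    """Hypothetical stand-in for a v2 event with explicit getters."""

    def get_attributes(self):
        return {"id": "123", "source": "/myapp",
                "type": "com.example.created", "specversion": "1.0"}

    def get_data(self):
        return {"message": "hello"}


def event_to_dict(event):
    """v1-style to_dict replacement: combine attributes with the data payload."""
    d = dict(event.get_attributes())
    d["data"] = event.get_data()
    return d


print(event_to_dict(StubEvent())["data"])  # {'message': 'hello'}
```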

## Quick Reference: Import Mapping

| v1 Import                              | v2 Import                                              |
|----------------------------------------|--------------------------------------------------------|
| `cloudevents.http.CloudEvent`          | `cloudevents.core.v1.event.CloudEvent`                 |
| `cloudevents.http.from_http`           | `cloudevents.core.bindings.http.from_http_event`       |
| `cloudevents.http.from_json`           | `cloudevents.core.formats.json.JSONFormat().read`      |
| `cloudevents.http.from_dict`           | `cloudevents.core.v1.event.CloudEvent(attributes=...)` |
| `cloudevents.conversion.to_binary`     | `cloudevents.core.bindings.http.to_binary_event`       |
| `cloudevents.conversion.to_structured` | `cloudevents.core.bindings.http.to_structured_event`   |
| `cloudevents.conversion.to_json`       | `cloudevents.core.formats.json.JSONFormat().write`     |
| `cloudevents.conversion.to_dict`       | `event.get_attributes()`                               |
| `cloudevents.kafka.KafkaMessage`       | `cloudevents.core.bindings.kafka.KafkaMessage`         |
| `cloudevents.kafka.to_binary`          | `cloudevents.core.bindings.kafka.to_binary_event`      |
| `cloudevents.kafka.from_binary`        | `cloudevents.core.bindings.kafka.from_binary_event`    |
| `cloudevents.pydantic.CloudEvent`      | Removed                                                |
| `cloudevents.abstract.AnyCloudEvent`   | `cloudevents.core.base.BaseCloudEvent`                 |

## CloudEvents Spec v0.3

@@ -396,9 +412,11 @@ event = CloudEvent(
        "source": "/myapp",
        "id": "123",
        "specversion": "0.3",
        "schemaurl": "https://example.com/schema",
        # v0.3-specific (renamed to dataschema in v1.0)
    },
)
```

Binding functions auto-detect the spec version when deserializing, so no special
handling is needed on the receiving side.
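A rough sketch of what that auto-detection amounts to for a structured JSON body. This is a hypothetical helper illustrating the idea, not the SDK's actual code:

```python
import json


def detect_specversion(structured_body: bytes) -> str:
    """Inspect the structured-mode body and report its CloudEvents spec version."""
    attrs = json.loads(structured_body)
    version = attrs.get("specversion")
    if version not in ("0.3", "1.0"):
        raise ValueError(f"unsupported specversion: {version!r}")
    return version


body = json.dumps(
    {"specversion": "0.3", "type": "com.example.created", "source": "/myapp", "id": "1"}
).encode("utf-8")
print(detect_specversion(body))  # 0.3
```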
