@jeremydmiller This would be nice to have in v4, but it can also be added later as a non-breaking change. In practice, it would require adding some middleware between deserialization and applying events during the aggregation/projection apply.
Edited by @oskardudycz
Overview
Currently, Marten supports only the two simplest approaches to event schema versioning. More on that here: https://martendb.io/documentation/events/versioning/.
We also support basic migration strategies that can be applied to the JSON with Newtonsoft (adding a new field, etc.). That covers only the most basic cases, though, not the more advanced scenarios.
Upcasting
The most typical scenario for events is upcasting. Often we'd like to avoid maintaining multiple event schemas (like OrderCreated, OrderCreatedV1, OrderCreatedWithQuantity, etc.). Schema changes can take different forms:
renaming a property,
changing a property type (e.g. from string to DateTime, from int to Enum, from Product to PricedProduct),
changing the structure of properties and nested objects (e.g. what was previously an enum is now an object, nested objects are heavily restructured, etc.).
I think that, as a minimum, we should provide some mechanism that's close to the serialization process. The Java Axon framework does that (see e.g. here: https://docs.axoniq.io/reference-guide/axon-framework/events/event-versioning). I also found projects like https://github.com/Weingartner/Migrations.Json.Net that do such things. I'm not a huge fan of that kind of API - it's not the most user-friendly - but at least it exposes all the options and lets the user do everything.
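To make the idea concrete, here is a minimal sketch of what such a mechanism close to the serialization process could look like. This is NOT Marten's actual API - all type, method, and property names here (IEventUpcaster, OrderCreatedUpcaster, Quantity, etc.) are hypothetical illustrations of the approach:

```csharp
using Newtonsoft.Json.Linq;

// A sketch only: an upcaster plugged in between raw JSON deserialization
// and C# event instance creation. Names are hypothetical, not Marten's API.
public interface IEventUpcaster
{
    // The event type name as stored in the database.
    string EventTypeName { get; }

    // Transforms the old raw JSON payload into the current schema.
    JObject Upcast(JObject oldJson);
}

// Example: OrderCreated gained a Quantity property;
// old payloads without it get a default value of 1.
public class OrderCreatedUpcaster : IEventUpcaster
{
    public string EventTypeName => "order_created";

    public JObject Upcast(JObject oldJson)
    {
        if (oldJson["Quantity"] == null)
            oldJson["Quantity"] = 1;
        return oldJson;
    }
}
```

The serializer would look up the registered upcaster by the stored event type name, run it on the raw payload, and only then materialize the C# class - so the rest of the system always sees the latest schema.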
I was thinking that maybe @jeremydmiller's black magic with code generation could be helpful here? At least for defining the basic mappings it could work. Maybe some semi-automapper, higher-level abstraction would do.
However, I still believe that this should be built on top of some flexible logic injected between deserialization and C# class instance creation, so that our users could extend it. It would be hard to support all possible mappings that users can come up with.
Of course, performance would also be a factor in choosing a solution.
I think that the same mechanism should be used in:
stream aggregation,
projections rebuild,
etc.
We may also think about making this mechanism more generic and giving our users the possibility to define pipelines/decorators. That could be used for a series of upcasters, or for more generic concerns like decoding, deciphering and other transformations.
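A pipeline like that could be sketched as a chain of JSON-level transformation steps, where each step may return one or more payloads - which would also cover the event-splitting case mentioned below. Again, these names are made up for illustration, not an actual Marten API:

```csharp
using System.Collections.Generic;
using System.Linq;
using Newtonsoft.Json.Linq;

// Hypothetical: each transformation step takes one raw payload and yields
// one or more payloads (upcast, decode, decipher, split, ...).
public delegate IEnumerable<JObject> EventTransformation(JObject json);

public static class TransformationPipeline
{
    // Runs the payload through the steps in order; because each step can
    // fan out into multiple payloads, a chain of upcasters and a
    // one-event-into-many split compose with the same mechanism.
    public static IEnumerable<JObject> Run(
        JObject input, IEnumerable<EventTransformation> steps)
    {
        IEnumerable<JObject> current = new[] { input };
        foreach (var step in steps)
            current = current.SelectMany(json => step(json));
        return current;
    }
}
```

Decorators for deciphering or decompressing stored payloads would then just be additional steps registered in the same chain, in front of the upcasters.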
It's not a must-have, but sometimes when the logic changes, an event should be split into multiple events - e.g. OrderCompleted into OrderPaid and OrderShipped (or the other way round). We could also consider that.
Publishing events in old and new schema
Another versioning strategy is publishing events in both the old and the new schema for as long as there are still subscribers handling the old schema version. Currently it's not that helpful, as Marten is focused more on a single instance, but if we move toward a more distributed environment (with the Async Daemon), it might be worth considering.
Schema validation
We might add support for JSON Schema (https://json-schema.org/) to Marten and validate whether a published event matches the defined schema. We could use that, e.g., to provide out-of-the-box contract tests by:
storing the schema of the current version of the event,
providing a method like AssertEventsMatchesSchema - it would, e.g., create events with default and empty values (or even give the user the option to write their own), get the JSON schemas from the database, and perform validations.
We could also provide or find another notation (e.g. a fluent one) that would allow easier validation.
We could do something like: https://github.com/VerifyTests/Verify.
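As an illustration, the stored schema for a hypothetical OrderCreated event could look like the fragment below (a sketch of what Marten might persist and assert published events against; the property names are invented for the example):

```json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "title": "OrderCreated",
  "type": "object",
  "properties": {
    "OrderId": { "type": "string", "format": "uuid" },
    "Quantity": { "type": "integer", "minimum": 1 }
  },
  "required": ["OrderId", "Quantity"],
  "additionalProperties": false
}
```

A contract test would then fail as soon as a code change produces an event payload that no longer validates against the stored schema, e.g. by removing a required property or changing its type.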
Migrations
Of course, events should be immutable, and of course we should not change events stored in the database, especially if we're still publishing them, however...
...sometimes it's just the easiest way out. Not only for events, but especially for documents, we could consider some helpers for data migrations. This could be a separate package.
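Such a helper could be as small as a JSON-level patch utility applied to the stored data. A minimal sketch, with entirely hypothetical names (not an existing Marten API):

```csharp
using Newtonsoft.Json.Linq;

// Hypothetical data-migration helper that patches raw stored JSON in place -
// the "we should not do this, but..." escape hatch discussed above.
public static class DocumentMigrations
{
    // Renames a top-level property in a document's stored JSON payload.
    public static JObject RenameProperty(JObject doc, string from, string to)
    {
        var value = doc[from];
        if (value != null)
        {
            doc.Remove(from);
            doc[to] = value;
        }
        return doc;
    }
}
```

A separate package could collect helpers like this (rename, set default, restructure) and apply them in bulk across a document table during a maintenance window.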