we have folk practices for lightweight version control: screenshots, copy/paste, duplicating files & layers.
these are all forms of snapshotting and branching. we don't have an equivalent for lightweight merging.
auto-merging solutions in CRDTs get us part of the way there, but the way we think about user intent and merging is still too low-level.
in a text or list CRDT, we model individual inserts and deletes and define rules for how concurrent edits converge.
event sourcing also helps with this: "the user did x action, the data changed in this way".
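a rough sketch of that low-level framing, in typescript — the types and the merge rule are made up for illustration, not taken from any real CRDT library (real ones anchor edits to stable element ids rather than raw indices, so concurrent edits don't shift each other around):

```ts
// a made-up op log for a list document: every change is an insert or a delete,
// and merging means unioning both histories and replaying them in a
// deterministic order so both sides converge on the same result.
type Op =
  | { kind: "insert"; actor: string; ts: number; index: number; value: string }
  | { kind: "delete"; actor: string; ts: number; index: number };

function merge(a: Op[], b: Op[]): Op[] {
  // order by timestamp, tie-break by actor id, so every replica
  // replays the combined history the same way
  return [...a, ...b].sort(
    (x, y) => x.ts - y.ts || x.actor.localeCompare(y.actor)
  );
}

function apply(ops: Op[]): string[] {
  const doc: string[] = [];
  for (const op of ops) {
    if (op.kind === "insert") doc.splice(op.index, 0, op.value);
    else doc.splice(op.index, 1);
  }
  return doc;
}
```

everything here is expressed in terms of positions and values — nothing in the data knows *why* the user made the edit.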
but with AI we can begin to think about changes as semantic intent: "the character got older", "the time of day changed from morning to night", etc.
but it's not just about making merging simpler to reason about; these tools are making deconstruction and remixing an integrated part of the creative process.
being able to say "i have this video, but i want to pull out the color palette from this image and apply it" or "i want to apply the mood of this song to the composition of this image" changes merging from a tedious task into something fun and creative.
it goes further than this, though: AI tools are a form of "predicting future intent", so they're going to get things wrong. you want workflows where you can seamlessly duplicate media and try things, creating many variations & ripping out the properties of the ones you like for later use.
it feels a bit like foraging: collecting a heap of snippets and examples of ideas you like for later recombination. the mixed-media moodboard becomes a first-class part of the creative process.
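a loose sketch of how that duplicate-and-forage loop might be modelled — every name here is invented:

```ts
// invented model: fork a piece of media into variations, then rip the
// properties you like out of them into a pile of snippets for later.
type Variation = {
  id: string;
  parent?: string;                     // which variation it was forked from
  properties: Record<string, unknown>; // e.g. { palette: [...], mood: "dusk" }
};

type SnippetPile = Record<string, unknown>[];

function fork(v: Variation, id: string): Variation {
  return { id, parent: v.id, properties: { ...v.properties } };
}

function keep(v: Variation, key: string, pile: SnippetPile): SnippetPile {
  // pull one property out of a variation for later recombination
  return key in v.properties ? [...pile, { [key]: v.properties[key] }] : pile;
}
```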
bringing collaborative workflows closer to the solo experience of working with media breaks down the barrier to collaborating with others: it becomes less of a stretch to invite someone else into the space.