WEBVTT

00:00.000 --> 00:12.000
Now we're going to have a session of five lightning talks, five minutes each, and there

00:12.000 --> 00:18.560
are no questions until the end, when we can ask questions. The first speaker

00:18.560 --> 00:25.160
today is going to talk about Miru, building a collaborative video editor with offline

00:25.160 --> 00:31.560
support. Let's welcome him with a round of applause.

00:31.560 --> 00:41.320
Hello, thanks. I'm Ty Adiyami. I started the project Miru about a year and a half ago,

00:41.320 --> 00:46.320
and I'm building media editing tools for web apps, so they can be put into existing web

00:46.320 --> 00:55.400
apps or used independently. The work is funded through NGI0, established by the NLnet Foundation

00:55.400 --> 01:04.320
and financed by the European Commission. So that's the URL of the video editor that

01:04.320 --> 01:10.640
I'm currently working on. It's very simple, but it's also very easy to use; the idea is that

01:10.640 --> 01:17.640
you can jump in and pretty much figure it out in five minutes. You can just add clips, rearrange them,

01:17.640 --> 01:25.640
and so on. Compositing is done with the VideoContext library for now; I'm going to move to something

01:25.640 --> 01:32.520
better. It renders to a WebGL canvas up there and uses WebCodecs for exporting, which is pretty

01:32.520 --> 01:40.560
new and not supported in all browsers yet. So, going back: in the video editor,

01:40.560 --> 01:50.480
the data is modeled as a tree. The timeline is a tree of nodes: each track is a node,

01:50.480 --> 01:58.160
and the clips in each track are nodes. Yep. So when I was thinking about which CRDT to use to

01:58.160 --> 02:05.720
make this collaborative, well, first I had to pick a library. There are a lot of them. This

02:05.720 --> 02:13.160
is a great website. Amazing. Thank you, whoever made it. It lists so many different options,

02:13.160 --> 02:19.320
and you can filter for whatever criteria you need. My criteria were basically that I

02:19.320 --> 02:25.000
wanted something very small, because it's an embedded editor, and ideally something

02:25.000 --> 02:32.040
that could handle trees, but I didn't find a perfect fit. Yjs was the most popular:

02:32.120 --> 02:37.320
great ecosystem, very small. Automerge was comparable; I think its ecosystem was smaller,

02:37.320 --> 02:43.800
but that might have changed by now, and I think it's much more actively developed. But it's a

02:43.800 --> 02:49.800
Wasm build, so it's pretty big. Loro is new and has move operations, so it would be kind

02:49.800 --> 02:55.960
of perfect for a tree, but it's also a Wasm build. So I stuck with Yjs. And if you're building

02:55.960 --> 03:01.400
any collaborative design software, I think it's impossible to miss this article

03:01.400 --> 03:09.160
by Figma co-founder Evan Wallace. And I also found this one on tree-based indexing,

03:09.160 --> 03:16.200
basically CRDTs for trees, and it ended up being a near-perfect fit. And then I found a

03:16.200 --> 03:24.760
library which implements what was described there on top of Yjs. Basically, you can have

03:24.760 --> 03:30.120
trees and directories; you can move stuff around, add children, and update them, and it'll make sure there are

03:30.200 --> 03:36.440
no cycles, so it doesn't break the tree or end up with duplication.
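A minimal sketch of how a timeline tree like that could be modeled with Yjs shared types. This is illustrative only, assuming Yjs's Y.Doc, Y.Map, and Y.Array APIs; the key names ('tracks', 'clips') and the clip fields are hypothetical, not Miru's actual schema.

import * as Y from 'yjs'

// Build the timeline as a tree of shared types: a root map,
// an array of track nodes, and an array of clip nodes per track.
const doc = new Y.Doc()
const timeline = doc.getMap('timeline')   // root node of the tree

const tracks = new Y.Array()
timeline.set('tracks', tracks)            // tracks are children of the timeline

const clips = new Y.Array()
const track = new Y.Map()
track.set('kind', 'video')
track.set('clips', clips)                 // clips are children of a track

const clip = new Y.Map()
clip.set('src', 'media/intro.mp4')        // hypothetical media reference
clip.set('offset', 0)                     // position on the track, in seconds
clip.set('duration', 5)

tracks.push([track])                      // Y.Array.push takes an array of items
clips.push([clip])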
03:36.440 --> 03:42.040
But it does have the issue of interleaving: if you add a few items here and another user also adds a few,

03:42.040 --> 03:47.000
then instead of one group coming after the other, they might end up mixed together, which for

03:47.000 --> 03:54.040
a directory is fine, because a directory is sorted by something like creation time or name anyway.

03:54.600 --> 03:59.880
But in a timeline, having the clips interleave with each other doesn't make

03:59.960 --> 04:05.080
any sense. For Figma it's fine, because in Figma you are always online. There's no offline

04:05.080 --> 04:09.800
editing, so there's never enough time for stuff to get interleaved. So,

04:09.800 --> 04:15.000
let's pretend that part is solved; I can borrow techniques from text CRDTs to resolve the

04:15.000 --> 04:25.800
interleaving. But how do I figure out what the user expects to happen when they're

04:25.880 --> 04:32.920
video editing? What if they split a clip in two? There are now two clips where there

04:32.920 --> 04:41.080
used to be one, and then they resize those clips, but then someone else

04:41.080 --> 04:47.240
goes back and, say, deletes the original clip or changes its duration or the media source,

04:47.800 --> 04:54.600
so how should that be resolved? There are different ways. I used to implement the split as:

04:55.320 --> 05:01.160
the first clip gets shorter, and then a new one is added, but that resulted in a lot of

05:01.160 --> 05:06.440
issues, so I kind of have to juggle things around. There are other issues too, but the big problem

05:06.440 --> 05:10.280
is figuring out what the expectations are, and to do that you kind of need to know what the

05:10.280 --> 05:15.320
intentions are, which is why you end up adapting the CRDT to each application, basically.

05:15.960 --> 05:21.640
What about subtitles and transitions and keyframe animations, or moving groups of things

05:21.640 --> 05:30.280
together? I honestly don't know yet. I'll figure it out eventually, I hope. And what about weird stuff?

05:30.280 --> 05:36.520
There's a GIF of it loading here: in video editing there's a concept of ripple

05:36.520 --> 05:42.200
editing, where you're editing multiple clips at the same time with a specific

05:42.280 --> 05:48.680
intention, and I'd have to figure out how to encode that intention

05:48.680 --> 05:57.160
in the CRDT, and that's hard. Yeah, so: lots of user testing, lots of feedback. Come to me

05:57.160 --> 06:03.560
if you have any advice. Thank you very much.
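An illustrative footnote to the duration and media-source example, not Miru's actual resolution logic: a Yjs map merges per key, so two offline edits to different fields of the same clip both survive a sync, while intent-level operations like splits and ripple edits still need application logic on top. The clip fields below are hypothetical.

import * as Y from 'yjs'

// Two users edit the same clip while offline, then sync.
// Y.Map merges per key: edits to different fields both survive.
const docA = new Y.Doc()
const docB = new Y.Doc()

const clipA = docA.getMap('clip')
clipA.set('duration', 10)
clipA.set('src', 'media/take1.mp4')

// Start docB from the same state.
Y.applyUpdate(docB, Y.encodeStateAsUpdate(docA))
const clipB = docB.getMap('clip')

// Offline: user A trims the clip, user B swaps the media source.
clipA.set('duration', 4)
clipB.set('src', 'media/take2.mp4')

// Exchange updates in both directions.
Y.applyUpdate(docB, Y.encodeStateAsUpdate(docA))
Y.applyUpdate(docA, Y.encodeStateAsUpdate(docB))

console.log(clipA.toJSON())  // { duration: 4, src: 'media/take2.mp4' }; same on docB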