You should try Chillin (https://chillin.online), a browser-based video editor. Powered by WebGL, WASM, and WebCodecs, Chillin provides a full suite of video editing capabilities on the web. Trust me, Chillin is smoother than most native video editors, even on mobile. I believe Chillin could leverage WebGPU to bring even more powerful rendering features.
Yes, running LLMs on the web may not offer significant advantages due to speed limitations, but other models, such as those for background removal, speech-to-subtitles, and translation, could become practical and efficient thanks to WebGPU.
As a web video editor (https://chillin.online), we are eagerly looking forward to the WebGPU API maturing and reaching all major browsers, enabling faster rendering, more effects, and easier rendering and editing of 3D assets.
Impressive blog! I am building a professional web video editor, https://chillin.online, and trying to embed various AI workflows into it. Your article has given me a lot of inspiration. Thank you!
Congrats on the launch, fantastic work! The smooth experience is impressive. I launched an online video editor, Chillin (https://chillin.online), which is also WebGL-based and renders very quickly. I'm interested to know whether Repaint will migrate to WebGPU, as Chillin is planning to do. I believe web apps using WebGPU for rendering can be even faster than native apps!
We're not sure. WebGPU seemed too early to us. But as I got deeper into WebGL, the ability to do more arbitrary computation sounded more and more appealing. I wouldn't be surprised if we migrated down the road.
I'm working on Chillin, a next-gen AI video & motion editor: https://chillin.online.
Chillin has significant advantages over its competitors (veed.io, Clipchamp, CapCut online): e.g., it supports mobile devices, offers full keyframe support, exports high-resolution video without a watermark, and supports vector motions.
So I feel like I am changing the world through this work.
Hi HN,
We are excited to announce that Chillin Online Video Editor has officially launched its new free AI subtitle feature. With the recent shift to paid subtitle functionality by other platforms, we understand many users heavily rely on this feature. Now, Chillin offers a new solution.
Why Choose Chillin's AI Subtitle Feature?
Free: Chillin’s local AI subtitle feature is completely free, allowing unlimited generation and export of SRT subtitles.
Multi-language Support: Supports over 100 languages.
No Installation Required: Chillin runs entirely in the browser, offering professional video editing features without any installation.
Privacy Protection: Our local AI subtitle feature is based on Whisper running on WebGPU, ensuring no data is uploaded.
Powerful Subtitle Editing: Supports subtitle timestamps, styles, backgrounds, animations, and smooth previews.
Note: Due to memory constraints, the free AI subtitle feature is only available on desktop browsers that support WebGPU. We recommend using Chrome for the best experience.
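For anyone curious about the export side, here is a minimal sketch (my own illustration, not Chillin's actual code) of how timestamped segments, like those a speech model such as Whisper emits, can be serialized into an SRT string:

```typescript
interface Segment {
  start: number; // seconds
  end: number;   // seconds
  text: string;
}

// SRT timestamps use the form "HH:MM:SS,mmm" (comma before milliseconds).
function toSrtTimestamp(seconds: number): string {
  const ms = Math.round(seconds * 1000);
  const h = Math.floor(ms / 3_600_000);
  const m = Math.floor((ms % 3_600_000) / 60_000);
  const s = Math.floor((ms % 60_000) / 1000);
  const rem = ms % 1000;
  const pad = (n: number, w: number) => String(n).padStart(w, "0");
  return `${pad(h, 2)}:${pad(m, 2)}:${pad(s, 2)},${pad(rem, 3)}`;
}

// SRT cues are 1-indexed and separated by blank lines.
function toSrt(segments: Segment[]): string {
  return segments
    .map((seg, i) =>
      `${i + 1}\n${toSrtTimestamp(seg.start)} --> ${toSrtTimestamp(seg.end)}\n${seg.text}`)
    .join("\n\n") + "\n";
}

console.log(toSrt([
  { start: 0, end: 2.5, text: "Hello from the browser." },
  { start: 2.5, end: 4.25, text: "No upload required." },
]));
```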
Hope you can try it!
Your editor downloads a 32.6MB ffmpeg WASM binary on every page load.
Throttling the network to "Slow 3G", it took over four minutes of a broken interface before ffmpeg finally loaded. (It doesn't cache, either.) A port of the Audacity audio editor to the web [1] with WASM takes 2.7 minutes on the same connection, so the binary size itself is reasonable, but I think claiming less than 2 MB is disingenuous.
Sorry about that; we focused on the JS bundle and didn't realize how big ffmpeg.wasm is. Thanks for the reminder. As a next step, we will try to rebuild ffmpeg.wasm and make it smaller.
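For what it's worth, ffmpeg's own configure script makes this kind of trimming straightforward: disable everything, then re-enable only the components the editor actually uses before compiling to Wasm. A rough sketch; the component list here is a guess, and Chillin's real needs may differ:

```shell
# Minimal ffmpeg configuration before the Emscripten/Wasm build.
# Start from nothing, then opt back in to the formats and codecs used.
./configure \
  --disable-everything \
  --disable-doc \
  --disable-network \
  --enable-protocol=file \
  --enable-demuxer=mov \
  --enable-demuxer=matroska \
  --enable-muxer=mp4 \
  --enable-decoder=h264 \
  --enable-decoder=aac \
  --enable-encoder=aac \
  --enable-parser=h264 \
  --enable-parser=aac
```

Builds trimmed this way can shrink dramatically compared with an everything-enabled ffmpeg.wasm, and serving the binary with long-lived cache headers (or via the Cache Storage API) would address the reload cost as well.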
"Versus our initial moment-based implementation, in Chrome we see a 78% improvement (183.93ms to 39.69ms), in Firefox a 90% improvement (269.80ms to 24.88ms), and in Safari an 83% improvement (166.56ms to 27.98ms)."
DOM elements are expensive, so it's probably not down to V8 itself in that second comparison.
Generally, one should expect significantly higher performance with Rust compiled into (optimised) Wasm, on the order of 2x to 10x. I don't have strong numbers at hand to share right now.
But in a minority of cases, it might be slower than on V8, as the latter has a few extremely highly optimised JS operations.
Eh, I would rather see a comparison to a WebGL approach, given we're doing tweening here. Fine not to have one; it just sort of leaves the question up in the air. Faster than Canvas at the very least!
Shaders would only (tremendously) improve rendering versus DOM elements.
And Wasm can also leverage the GPU, so it would yield similar performance when comparing apples to apples.
V8 interpreting JS for CPU computation is, I think, what OP was asking about, as it is relevant to determining the best-optimised route to the highest compute performance.
If most published benchmarks are correct*, then a graphics application compiled into Wasm, with GPU rendering alongside its CPU compute logic, would perform better than its JS+WebGL counterpart.
Would be nice to benchmark that to confirm.
*And they probably are, as Wasm executes at near-native speed.
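As a toy illustration of what such a harness could look like (numbers from a micro-benchmark like this say little about real rendering workloads; the Wasm module below is just a hand-assembled i32 add, not a Rust build):

```typescript
// Minimal hand-encoded Wasm module exporting add(a, b) = a + b on i32.
const wasmBytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,             // magic + version
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f,       // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                                     // function section
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00,       // export "add"
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b, // local.get 0/1, i32.add
]);

const instance = new WebAssembly.Instance(new WebAssembly.Module(wasmBytes));
const wasmAdd = instance.exports.add as (a: number, b: number) => number;
const jsAdd = (a: number, b: number) => (a + b) | 0;

// Time many calls. Note that crossing the JS<->Wasm boundary on every call
// dominates here, which is exactly why per-call micro-benchmarks mislead.
function bench(label: string, fn: (a: number, b: number) => number): number {
  const t0 = performance.now();
  let acc = 0;
  for (let i = 0; i < 1_000_000; i++) acc = fn(acc, i);
  console.log(`${label}: ${(performance.now() - t0).toFixed(2)} ms`);
  return acc; // checksum, so the loop isn't optimised away
}

bench("js", jsAdd);
bench("wasm", wasmAdd);
```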
V8 executes certain operations at near-native speed. The rest takes the overhead hit of the interpreter.
I'm sorry, I may just not understand. My point is that even graphics-intensive games are also compute (CPU) intensive, so that is where significant performance differences matter. WebGL, WebGPU, and Wasm all execute shaders on the GPU, so I don't see the overhead as critical or of much interest to benchmark. I could be wrong, and drastically different results could be observed, but what I've read on the subject indicates they won't be. Or I don't understand what you mean.