More than 95% of our codebase is written in Flutter (Dart) and shared between our native and web apps. To work with videos and perform other platform-specific operations, we write code in C++, Swift, Java, JavaScript, and WASM. Finally, our backend, machine learning, and analytics tools are written in Python and Node.js.
Our state-of-the-art video editor uses a modified Flutter engine to render layers on screen, react to user input, and export the composition.
Our rendering pipeline takes care of displaying layers (shapes, images, videos, stickers, and more) and responding to user input, such as translation and rotation. Most of the pipeline is handled and manipulated in Dart and rendered in pure Flutter. For videos, we modified the Flutter engine to extract frames as Dart ui.Image objects as fast as possible, something the stock engine does not currently allow.
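To give an idea of why ui.Image matters here, the sketch below shows how a decoded video frame can be composited alongside other layers with a regular Canvas. The VideoFrameExtractor interface is a hypothetical stand-in for the engine modification described above, not the real internal API.

```dart
import 'dart:ui' as ui;
import 'package:flutter/widgets.dart';

// Hypothetical surface over the modified engine: hands back the decoded
// frame nearest to a timestamp as a ui.Image, ready to draw.
abstract class VideoFrameExtractor {
  Future<ui.Image> frameAt(Duration timestamp);
}

// Once a frame is a ui.Image, it composites like any other layer.
class VideoLayerPainter extends CustomPainter {
  VideoLayerPainter(this.frame);
  final ui.Image frame;

  @override
  void paint(Canvas canvas, Size size) {
    // Draw the video frame, then any overlays (shapes, stickers, ...)
    // on top of it in the same paint pass.
    canvas.drawImage(frame, Offset.zero, Paint());
  }

  @override
  bool shouldRepaint(VideoLayerPainter old) => old.frame != frame;
}
```

This is what makes videos first-class citizens of the pipeline: a frame exposed as a ui.Image participates in the exact same compositing pass as every other layer.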
To export videos, we modified the Flutter engine to asynchronously extract the output of the rendering pipeline frame by frame, in an accelerated way and without blocking the UI. These frames are then aggregated at the platform level and encoded into a video that our users can simply share on their social media. This process allows us to be mostly platform agnostic.
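The export loop can be sketched as follows, assuming the Dart side streams rendered frames to a platform-side encoder over a MethodChannel. The channel name, method names, and the renderFrame callback are illustrative, not our actual interface.

```dart
import 'dart:typed_data';
import 'package:flutter/services.dart';

// Hypothetical channel to the platform-side encoder (AVFoundation,
// MediaCodec, WebCodecs, ... depending on the platform).
const MethodChannel _encoder = MethodChannel('app/video_encoder');

Future<void> exportComposition(
  Duration length,
  double fps,
  Future<Uint8List> Function(Duration t) renderFrame, // pipeline output
) async {
  final frameStep = Duration(microseconds: (1e6 / fps).round());
  await _encoder.invokeMethod('start', {'fps': fps});

  for (var t = Duration.zero; t < length; t += frameStep) {
    // Render the composition at timestamp t; awaiting keeps the UI thread
    // free between frames.
    final Uint8List pixels = await renderFrame(t);
    await _encoder.invokeMethod('addFrame', pixels);
  }

  // The platform side muxes and encodes the aggregated frames.
  await _encoder.invokeMethod('finish');
}
```

Because only the thin encoding layer differs per platform, the Dart loop above stays identical everywhere, which is what keeps the process largely platform agnostic.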
Our goal is to provide a seamless editing experience: while most projects might start and end on mobile, some features work better on wider screens. That's why we let users start editing on their phone while still having the freedom to finalise their projects on a different device.
To give our users a seamless cross-device experience, we implement consistent UI elements across platforms. This is why we created Magma, our reusable UI component library, which plays a vital role in our development process.
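As an illustration of what a library like Magma provides, here is a minimal Magma-style component: one widget with a single source of truth for its styling, so it looks and behaves identically on every platform. The name and styling are hypothetical.

```dart
import 'package:flutter/material.dart';

// Illustrative Magma-style component: the design decisions (shape,
// padding) live in one place and are reused across all apps.
class MagmaButton extends StatelessWidget {
  const MagmaButton({super.key, required this.label, required this.onTap});

  final String label;
  final VoidCallback onTap;

  @override
  Widget build(BuildContext context) {
    return ElevatedButton(
      style: ElevatedButton.styleFrom(
        padding: const EdgeInsets.symmetric(horizontal: 24, vertical: 12),
        shape: RoundedRectangleBorder(
          borderRadius: BorderRadius.circular(12),
        ),
      ),
      onPressed: onTap,
      child: Text(label),
    );
  }
}
```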
TCA (The Composable Architecture) is a Redux-like, unidirectional data-flow library that simplifies state management in a consistent and understandable way. We adopted this paradigm from its inception and decided to port it to our Flutter stack.
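The core of the pattern can be sketched in a few lines of pure Dart: state is immutable, actions are the only way to change it, and a reducer is a pure function from (state, action) to new state. This is a minimal illustration of the paradigm, not our actual port, which also handles effects and reducer composition.

```dart
// Minimal sketch of TCA-style unidirectional data flow in Dart 3.

sealed class CounterAction {}

class Increment extends CounterAction {}

class Decrement extends CounterAction {}

class CounterState {
  const CounterState(this.count);
  final int count;
}

// A reducer is a pure function: same state + same action => same result.
typedef Reducer<S, A> = S Function(S state, A action);

CounterState counterReducer(CounterState state, CounterAction action) =>
    switch (action) {
      Increment() => CounterState(state.count + 1),
      Decrement() => CounterState(state.count - 1),
    };

// The store owns the state; the UI sends actions and re-renders from
// the new state, so data only ever flows in one direction.
class Store<S, A> {
  Store(this._state, this._reducer);
  S _state;
  final Reducer<S, A> _reducer;

  S get state => _state;
  void send(A action) => _state = _reducer(_state, action);
}
```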