Making movies like the pros doesn’t require a pricey camera rig or a tricked-out editing suite — at least, that’s Polarr’s pitch. The San Jose, California-based startup, which was founded by Google veterans Borui Wang and Derek Yan and which recently attracted $11.5 million in venture capital backing, today announced 24FPS, an editing app for iPhone and iPad devices designed to “help amateur … creators build high-quality video.”
It’s available starting today through the iOS App Store.
“24FPS is [intended] for busy people who want a premium look and feel for videos,” said Wang, who serves as CEO. “[Computer vision] is the core ingredient to provide the unique experience of recording, AI filter recommendation, and video editing in 24FPS. We look forward to market feedback and to iterating to improve the app.”
Built-in recommendation algorithms are 24FPS’ spotlight feature, without a doubt. They match regularly updated cinematic filters (in the form of lookup tables, or LUTs) inspired by popular movies with specific scenes, drawing principally on color preferences that the app’s AI engine learns with each use. 24FPS first susses out users’ preferences through an interactive quiz, and it continues to learn and refine its understanding of these preferences over time.
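To illustrate what a LUT-based filter does under the hood, here is a minimal sketch in Python. It is not Polarr's implementation; the per-channel curves and the "warm" look are invented for the example. A LUT simply maps each input color value to a stylized output value:

```python
# A minimal sketch of a color lookup table (LUT) filter, independent of
# Polarr's actual implementation. Each channel gets a 256-entry table
# mapping input intensity to output intensity; this hypothetical "warm"
# LUT boosts reds and mutes blues.

def build_warm_lut():
    """Build hypothetical per-channel LUTs (lists of 256 ints)."""
    red   = [min(255, round(i * 1.08)) for i in range(256)]  # boost reds
    green = list(range(256))                                 # leave greens as-is
    blue  = [round(i * 0.92) for i in range(256)]            # mute blues
    return red, green, blue

def apply_lut(pixel, luts):
    """Map one (r, g, b) pixel through the per-channel LUTs."""
    r, g, b = pixel
    red, green, blue = luts
    return (red[r], green[g], blue[b])

luts = build_warm_lut()
print(apply_lut((100, 100, 100), luts))  # neutral gray shifts warmer: (108, 100, 92)
```

Cinematic LUTs in practice are usually 3D tables (mapping full RGB triples, not channels independently), but the principle is the same: a precomputed color mapping applied per pixel, which is why new "filters" can ship as small data files rather than code.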
Complementing the filter recommendations are cinematic stabilization and inertia-based zoom controls, plus “beautification” effects that thin faces and smooth skin. Users can import and combine clips; edit them with text, music, and stickers; insert additional clips; and split clips or adjust their speed. 24FPS also supports both LUT filters and all QR filters from Polarr’s Photo Editor app.
Polarr 24FPS benefits from the Polarr Vision Engine, a hardware-agnostic stack purpose-built to enable computational photography on a range of devices. It comprises a set of self-trained neural network models — each individually compressed and optimized for on-device storage, RAM usage, and power consumption constraints — that underlie tech from Qualcomm, Oppo, HoverCam, and others.
For example, the Polarr Vision Engine powers Composition Guide, a real-time feature in the Samsung Galaxy S10’s native camera app that identifies optimal compositions and provides interactive guiding prompts. The engine is also at the heart of the company’s first-party cross-platform photo editing apps for macOS, iOS, Android, and Windows 10, including Polarr Photo Editor, Album+, and Deep Crop.
Polarr’s core team of about 24 members is spread across offices in Shenzhen and Silicon Valley and includes graduates from educational institutions like Stanford, Carnegie Mellon, and Duke and former employees of companies like Microsoft, Google, Qualcomm, and Baidu.