Back in February, OpenAI stunned everyone with its video-focused generative AI application Sora, which can take roughly two lines of text prompts and turn them into a 10-second FHD video. Even though the excitement was quickly tempered once the world realized how much computing power such generation requires, most people would assume that only world-class, industry-leading companies would have the resources to run a tool like this at full capacity. But that's not quite the case, according to a recent post by the GPU cloud service provider Backprop.

(Image: OpenAI Sora Demo 2)

MSN’s Jowi Morales cited the company’s blog post, which states: “With a GeForce RTX 3090, you can generate videos up to 4 seconds at 240p resolution. Generation takes around 30 seconds for a 2-second video and scales linearly at that proportion. A higher resolution than 240p will cost more VRAM than the RTX 3090 has.”
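Taking Backprop's numbers at face value, that linear scaling works out to roughly 15 seconds of compute per second of 240p output on an RTX 3090. A minimal sketch, assuming strictly linear scaling and ignoring any fixed startup overhead:

```python
# Rough generation-time estimate on an RTX 3090, extrapolated from
# Backprop's figure: ~30 seconds to render a 2-second 240p clip.
SECONDS_PER_CLIP_SECOND = 30 / 2  # ~15 s of compute per second of video

def estimated_generation_seconds(clip_length_s: float) -> float:
    """Assumes strictly linear scaling; Backprop caps feasible clip
    length at 4 seconds at 240p on a 3090."""
    return clip_length_s * SECONDS_PER_CLIP_SECOND

for length in (1, 2, 4):
    print(f"{length}s clip -> ~{estimated_generation_seconds(length):.0f}s to generate")
```

So even the 4-second maximum clip would take only about a minute to render, assuming the scaling really does hold all the way up.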

(Image: Gigabyte RTX 3090 Gaming OC 24G)

Yes, you read that right: consumer-tier users can use it, but even NVIDIA's very first 90-class card of the RTX era, packing 24GB of VRAM, is no match for the AI tool's resource demands. Clearly, Sora-class models were built with professional and data center GPUs and accelerators in mind. For example, Team Green's currently available Hopper-based H100 can be configured with up to 94GB of HBM3 memory, while the H200 goes up to 141GB of HBM3e, not to mention both offer far higher memory bandwidth as well.

Sounds good, right? Kindly check that you have a spare US$30,000 in your bank account before copping an H100 PCIe card. Can't do it? Then at roughly 25% of that cost, you can pick up an RTX 6000 Ada Generation card with 48GB of VRAM instead.
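For a rough sense of what that trade-off buys you, here is a back-of-the-envelope price-per-gigabyte comparison using only the ballpark figures above; the RTX 6000 Ada price is derived from the article's "about 25% of US$30,000", not a quoted street price:

```python
# Back-of-the-envelope cost per GB of VRAM, using the article's figures.
# Prices are ballpark assumptions, not current market quotes.
H100_PRICE_USD = 30_000
RTX6000_ADA_PRICE_USD = 0.25 * H100_PRICE_USD  # ~25% of the H100's cost

cards = {
    "H100 (94GB)": (H100_PRICE_USD, 94),
    "RTX 6000 Ada (48GB)": (RTX6000_ADA_PRICE_USD, 48),
}

for name, (price_usd, vram_gb) in cards.items():
    print(f"{name}: ~${price_usd / vram_gb:,.0f} per GB of VRAM")
```

By this crude metric the RTX 6000 Ada actually comes out cheaper per gigabyte, though it gives up the H100's memory bandwidth and raw compute.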

(Image: OpenAI Sora Demo 1)

Even if we set aside the money aspect, you may also run into copyright and IP issues because, as most people know, OpenAI is constantly under fire from lawsuits left and right over alleged content-ownership infringement. Who knows whether public users of Sora will become collateral victims down the line?

But suppose a private fork were given to, say, Disney or any other animation/movie/cinematography studio, and the model was trained, refined, and deployed within a confined space using only their own IPs. In that case, I'd have zero problems with that.

