We all know AI is set to be a major force in decision-making and automation, and one of the main players driving the industry is none other than NVIDIA, whose deep learning models are already used in games and self-driving cars, and are even helping build the Metaverse as we speak.
And if you are a 3D modeling artist, part of your job may soon be automated away, thanks to NVIDIA 3D MoMa.
Showcased during the CVPR conference, the new AI can empower all sorts of roles that rely on 3D modeling, such as architects, designers, concept artists, and game developers, by drastically cutting down the manual sculpting process and freeing up the remaining time for creative work. Based on inverse rendering, a technique that reconstructs a series of still photos into a 3D model, NVIDIA trained an AI model with GPU acceleration to do that job: it learns to piece 2D information together into the triangle meshes used to define 3D graphics.
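To get a feel for the idea, inverse rendering can be framed as an optimization: start from a guessed scene, render it, compare the result to the photos, and nudge the scene parameters until the rendered image matches. The Python snippet below is only a toy sketch of that loop, not NVIDIA's pipeline; the scene, the `render` function, and the single `radius` parameter are all made up for illustration. It recovers the radius of a disc from one "photograph" by gradient descent on pixel error:

```python
import numpy as np

# Toy inverse rendering (illustrative only, not NVIDIA's method): recover a
# scene parameter by matching a rendered image against a target photo.

def render(radius, size=32):
    """Render a white disc of the given radius on a black background."""
    ys, xs = np.mgrid[:size, :size]
    dist = np.sqrt((xs - size / 2) ** 2 + (ys - size / 2) ** 2)
    # Soft (sigmoid) edge so the image varies smoothly with the radius.
    return 1.0 / (1.0 + np.exp(dist - radius))

target = render(radius=9.0)   # stands in for a photograph of the real object

radius = 4.0                  # initial guess for the unknown parameter
for _ in range(200):
    loss = np.mean((render(radius) - target) ** 2)
    # Finite-difference gradient of the pixel loss w.r.t. the radius.
    eps = 1e-3
    grad = (np.mean((render(radius + eps) - target) ** 2) - loss) / eps
    radius -= 50.0 * grad     # gradient-descent step toward a better match

print(radius)                 # converges close to the true radius of 9.0
```

NVIDIA 3D MoMa works at a vastly larger scale, optimizing mesh geometry, materials, and lighting jointly across many photos, with the heavy lifting accelerated on the GPU, but the match-the-photos principle is the same.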
Delving a little deeper, NVIDIA explains that a single Tensor Core GPU can generate a model within an hour. The pipeline's output, which contains a full 3D mesh, materials, and lighting, is directly compatible with 3D graphics engines and modeling tools for immediate use. That's right, the algorithm even estimates proper lighting by default, so only minor touch-ups are needed afterwards.
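For context on what a triangle-mesh output actually is: a mesh is just a list of vertex positions plus triangles that index into that list, and modeling tools can ingest it through plain interchange formats. As a minimal, hypothetical sketch (this has nothing to do with 3D MoMa's actual exporter), here is a quad built from two triangles written out as a Wavefront OBJ file:

```python
# A triangle mesh boils down to vertices plus triangles indexing them.
# Toy example: a unit quad split into two triangles, saved in the widely
# supported Wavefront OBJ format that most 3D modeling tools can open.

vertices = [
    (0.0, 0.0, 0.0),
    (1.0, 0.0, 0.0),
    (1.0, 1.0, 0.0),
    (0.0, 1.0, 0.0),
]
triangles = [(0, 1, 2), (0, 2, 3)]   # indices into the vertex list

with open("quad.obj", "w") as f:
    for x, y, z in vertices:
        f.write(f"v {x} {y} {z}\n")
    for a, b, c in triangles:
        # OBJ face indices are 1-based, hence the +1.
        f.write(f"f {a + 1} {b + 1} {c + 1}\n")
```

A real reconstruction produces thousands of such triangles, plus material and lighting definitions, which is why the output can drop straight into an engine or editor.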
To show off a sample case, NVIDIA's research and creative teams demonstrated the workflow themselves by collecting around 100 images of five jazz band instruments, namely a trumpet, trombone, saxophone, drum set, and clarinet, from different angles. Feeding these images into NVIDIA 3D MoMa generates a base output, which they imported into the NVIDIA Omniverse 3D simulation platform for further editing. Here, materials can be swapped out while the shape is maintained, so you might as well make the instruments out of diamond or gold or something. After that, they dropped the sculpted instruments into a Cornell box for a rendering test, which showcased impressive lighting detail in terms of reflection and absorption.
Looking at the broader picture, the technology could be hugely useful for cutting down development time, whether by importing real-life assets directly into the digital world or by using them as a base to be sculpted further into fictional items. As a result, the entire dev cycle could become shorter, or more content could be crammed into a game. Either way, this is good news for fellow 3D artists, as their grunt work can finally be reduced.
To learn more about NVIDIA 3D MoMa, watch the original video for the full context.