Combining AI & Blippbuilder to create Metaverse experiences

We explore how AI content can be used inside Blippbuilder to build immersive experiences at scale.

The number of Generative AI tools has exploded off the back of advancements in AI, specifically diffusion models like Stable Diffusion. These tools take simple, natural language text inputs and interpret them to return detailed outputs based on guidance from the prompt.

Credit: Jay Alammar


Tying this back to the Metaverse, we’re still navigating what the term actually means and how it will materialise, but there are a few things that we know for certain... 

One of the requirements for any Metaverse initiative to be successful is a massive increase in available content, content that's both readily available and created on demand. With more and more AI tools coming to market, and the barrier to AI content generation falling all the time, an increasing number of users can create powerful media outputs to help populate the Metaverse.

Credit: Sequoia


The Metaverse presents an ideal medium to share this new AI-generated content, and, on the other hand, this content helps to populate Metaverse experiences and adds more richness and immersion at scale. The question then becomes: how can we bridge these two complementary fields to continue building next-generation experiences and get more value from these tools?

Blippbuilder is the perfect solution to enable this:

  1. Blippbuilder supports importing all media types generated by AI tools, including images, videos, audio and 3D models.

  2. Blippbuilder is a no-code platform, much like most AI tools, which rely on natural language prompts.

So let’s look at examples of how AI and Blippbuilder can be combined to produce Metaverse experiences at scale.

AI Image Generation

One of the biggest generative AI use cases is text-to-image generation, and tools like DALL·E 2 and Midjourney have made it incredibly easy to produce highly detailed images. These images can be used to fill out parts of an experience without having to source or produce bespoke content. In the example below, the portraits on top of the 'Day of the Dead' altar were generated using Midjourney to bring the experience to life. Check out the experience using this link.

Day of the Dead AR experience created in Blippbuilder


Synthetic Avatars

Another really powerful AI tool is Synthesia, which lets you generate AI avatars and custom videos from plain text. You can create a video of an AI avatar narrating a script of your choice, set against a green-screen background. Then, using the chroma key setting in Blippbuilder, you can make your AI avatar appear in your scene as a virtual host to support a range of use cases, including onboarding, training, how-to videos, product demos and much more. Check out this onboarding demo using a Synthesia avatar as a hologram.
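Blippbuilder applies the keying for you through its chroma key setting, but the underlying green-screen idea is simple: pixels that are "mostly green" become transparent. Here is a minimal NumPy sketch of that technique (a simplified illustration under assumed thresholds, not Blippbuilder's actual implementation):

```python
import numpy as np

def chroma_key(frame: np.ndarray, threshold: float = 0.4) -> np.ndarray:
    """Turn an RGB video frame into RGBA, making green-screen pixels transparent.

    frame: H x W x 3 uint8 RGB image (e.g. an avatar filmed on a green screen).
    threshold: how much greener than red/blue a pixel must be to be keyed out
               (an illustrative value, tuned per footage in practice).
    """
    rgb = frame.astype(np.float32) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # "Greenness" = how far green exceeds the stronger of the other channels,
    # so bright skin or white clothing is not accidentally keyed out.
    greenness = g - np.maximum(r, b)
    alpha = np.where(greenness > threshold, 0, 255).astype(np.uint8)
    # Append the alpha channel: transparent where the screen was green.
    return np.dstack([frame, alpha])
```

Real keyers add edge softening and spill suppression on top of this, but the hard mask above is the core of the effect.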


BariumAI Textures

Another example is a text-to-material generator like BariumAI, which creates textures and materials for 3D assets. Built on Stable Diffusion, it can generate 8K textures and maps (height, normal, emission, roughness and metallic) from simple text prompts. In the example below, Barium was used to generate textures for the planets, creating stunning new worlds that elevate the experience.
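To make the map types above concrete: a normal map, for instance, encodes per-pixel surface direction and can be derived from a height map via its gradients. Here is a minimal NumPy sketch of that relationship (an illustrative approximation, not BariumAI's pipeline):

```python
import numpy as np

def height_to_normal(height: np.ndarray, strength: float = 1.0) -> np.ndarray:
    """Derive a tangent-space normal map from a height map.

    height: H x W float array of surface heights in [0, 1].
    strength: scales how pronounced the bumps appear (illustrative parameter).
    Returns an H x W x 3 uint8 RGB normal map, the common [0, 255] encoding.
    """
    # Slope of the surface in each direction, via central differences.
    dy, dx = np.gradient(height.astype(np.float32))
    # A surface tilting "uphill" pushes the normal the opposite way.
    nx = -dx * strength
    ny = -dy * strength
    nz = np.ones_like(height, dtype=np.float32)
    # Normalise to unit length, then remap [-1, 1] -> [0, 255] for RGB storage.
    length = np.sqrt(nx**2 + ny**2 + nz**2)
    n = np.stack([nx, ny, nz], axis=-1) / length[..., None]
    return ((n * 0.5 + 0.5) * 255).astype(np.uint8)
```

A perfectly flat height map yields the familiar uniform lilac-blue normal map (normals pointing straight out of the surface), which is why untextured normal maps look that colour.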


There are several other AI tools that can further enrich your Blippbuilder experience: copy.ai for writing marketing copy, endel.io for generating custom sounds and music, play.ht for generating custom voiceovers, and many more.

This really is an exciting time to try out new generative AI applications to produce rich content at scale, and Blippbuilder is the perfect melting pot to bring it all together in a powerful Metaverse visualisation, which you can share with the world.