In the next installment of our 3D Future series, we explore how AI, automation and no-code solutions will help to scale Metaverse experiences and platforms.
You can explore technical principles and axioms, or even search for a standardised definition, but ultimately the promise of the Metaverse is the ability to shape your own surroundings. Through avatars and digital fashion, a Metaverse user has the licence to present themselves in any form of their choosing, and within these virtual worlds there’s no constraint on the activities and experiences on offer. Virtualising familiar experiences such as concerts and graduation ceremonies is just the starting point; once the Metaverse matures, we can expect any imaginable experience to be created or recreated. For example, following the success of the hit South Korean show ‘Squid Game’ on Netflix, people started creating their own versions of its games directly within Roblox.
Since the dawn of human consciousness, people have taken inspiration from their surroundings and interpreted or made objects as a form of self-expression; the Lascaux cave paintings are proof enough. Now, with the rise of three-dimensional modelling and platforms that enable the global sharing of experiences, self-expression once limited to sketches, paintings or poetry has manifested in the creation of digital worlds, worlds that their creators can not only occupy and inhabit but also continuously evolve.
This promise of creating what you want, when you want, is a powerful on-ramp for getting users to adopt, create and experiment with Metaverse technologies and platforms. The challenge, then, is enabling these users to create and populate virtual worlds with relative ease and low barriers to entry. The question is: how can we empower users to create immersive and engaging content without the considerable investment of time and effort currently associated with XR and game development?
The first technology to consider, which will help to scale and democratise content creation, is artificial intelligence; more specifically, semantic speech commands that can interpret instructions and generate assets according to a user’s requirements. Meta recently demoed Builder Bot, an AI tool that does exactly this. In the demo, Builder Bot was used to create a specific virtual world through plain voice commands, including “let’s go to the beach” and “let’s add some clouds”. Meta CEO Mark Zuckerberg said: “You’ll be able to create nuanced worlds to explore and share experiences with others with just your voice”. While this is still a prototype, it’s apparent how AI tools like this will help content creation scale.
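Builder Bot’s internals aren’t public, but the general idea of turning speech into scene edits can be illustrated with a minimal sketch. The example below (all names hypothetical, and using simple pattern matching in place of the speech recognition and learned language understanding a real system would need) maps transcribed commands to scene operations:

```python
import re

# Toy sketch of a voice-command scene builder. All class and method
# names are hypothetical; a production system would use speech
# recognition plus a learned language model rather than regexes.
class Scene:
    def __init__(self):
        self.environment = "empty"
        self.objects = []

    def apply(self, command: str):
        command = command.lower()
        # "let's go to the beach" -> swap the whole environment
        match = re.search(r"go to (?:the )?(\w+)", command)
        if match:
            self.environment = match.group(1)
            return
        # "let's add some clouds" -> spawn objects into the scene
        match = re.search(r"add (?:some )?(\w+)", command)
        if match:
            self.objects.append(match.group(1))

scene = Scene()
scene.apply("let's go to the beach")
scene.apply("let's add some clouds")
print(scene.environment, scene.objects)  # beach ['clouds']
```

The hard part that makes tools like Builder Bot valuable is everything this sketch omits: recognising speech robustly, resolving ambiguous instructions, and generating the actual 3D assets rather than plain labels.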
Another technology supporting the democratisation of content and experience creation is automation via commands and tools. Once a piece of content has been created, it still needs instructions on how to behave, whether stationary or being interacted with. In traditional development, considerable effort goes into character behaviour for NPCs (non-player characters). Elements such as rigging and animation are time-consuming and technical to build, which presents a barrier to the scaling of user-generated content; innovations in automation are helping to address this bottleneck. For example, navigation-mesh (NavMesh) systems, built into engines such as Unity, help characters intelligently navigate environments and worlds: users can offload the technical work of path creation and pathfinding and focus only on the ‘interesting’ parts of their experience. Mixamo is another example, a tool that automates rigging and animation for characters, which users can export and drop into scenes and experiences.
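To make the pathfinding that a NavMesh system automates concrete, here is a toy version of the underlying search. Real navigation meshes plan over walkable polygons baked from level geometry; this sketch substitutes a simple grid, with `#` marking obstacles, and a breadth-first search standing in for the engine’s planner:

```python
from collections import deque

def find_path(grid, start, goal):
    """Return a list of (row, col) steps from start to goal, or None.

    Toy stand-in for engine pathfinding: breadth-first search over a
    grid of walkable cells, where '#' marks an obstacle.
    """
    queue = deque([start])
    came_from = {start: None}  # each visited cell -> its predecessor
    while queue:
        current = queue.popleft()
        if current == goal:
            # Walk the chain of predecessors back to the start.
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] != "#" and (nr, nc) not in came_from):
                came_from[(nr, nc)] = current
                queue.append((nr, nc))
    return None  # goal unreachable

grid = [
    "....",
    ".##.",
    "....",
]
path = find_path(grid, (0, 0), (2, 3))
print(path)  # a shortest route of six cells around the obstacles
```

A user of a NavMesh-equipped engine never writes this search; they mark surfaces walkable, set a destination, and the engine handles routing, which is exactly the offloading described above.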
The final technology we’ll discuss is more of a trend than a standalone technology, but it’s a necessary inclusion in the list. No-code tools are pretty self-explanatory: they allow users to create technical output without writing a single line of code. And while the examples covered above also count as no-code tools, they only address part of the end-to-end experience and still need to work in conjunction with other technologies and platforms. The no-code tools that will enable the Metaverse to scale will let users generate a fully working, interactable experience with a few clicks.
Tools such as our own Blippbuilder let users start from scratch and generate real, working AR experiences - securely hosted and distributed - within minutes. Blippbuilder is an example of a no-code tool that truly democratises AR content creation, letting anyone create a compelling experience as simply as putting together a PowerPoint presentation. This low-cost tool, which includes free views for non-commercial use for each published experience, allows users to drag and drop features to create immersive content for a campaign, a favourite show or anything else that comes to mind, and to easily share it so others can experience their unique Metaverse content.