Web3 Galaxy Brain 🌌🧠

OpenVoxels with Jin (@dankvr)

1 November 2022

Transcript

Nicholas: Welcome to Web3 Galaxy Brain. My name is Nicholas. Each week I sit down with some of the brightest people building Web3 to talk about what they're working on right now. Today I'm joined by XR researcher and creator, Jin. You may be familiar with Jin's work building and evangelizing open and interoperable VR and AR experiences through the many projects he's been affiliated with, such as Webaverse, Exokit, M3, MetaFactory and more. In this conversation, Jin and I discuss his newest project, OpenVoxels, an effort to fund research and archiving activities focused on Cryptovoxels. We discuss the recent Webaverse NFT drop, M3's latest discoveries, and his upcoming Burning Man-like party that's set to take place in an archival snapshot of Cryptovoxels. Jin is a luminous figure in the web-centric XR research community, and it's always a pleasure to talk to him about what he's been learning and developing lately. I hope you enjoy the show. Hey everybody, welcome. Welcome to Galaxy Brain. Today I'm going to be talking to Jin, with my broken voice from being at a conference for over a week now, all about OpenVoxels, and hopefully a little bit on Webaverse and a few other things. So Jin, thanks for coming on the show.

Jin: Yeah, no problem.

Nicholas: So today we're going to talk all about Metaverse stuff, in big quotes. And actually I had you on the show on April 1st. Or was it 2021 or 2022? Good question. But April 1st, one of the two. Must have been 2022. Been a little while. That was when you were building Webaverse. I had you and Avaer on the show talking about the launch and the questing stuff you were thinking about. What's new since then? I want to hear about OpenVoxels, but maybe we could start with a tiny update on what's happened with Webaverse since then. I know you had a really successful mint, and maybe you could share a little bit about how you're involved there and how things are going.

Jin: So Webaverse got through the mint and now it's a lot of heads-down building still. If you look at the GitHub, there's a lot of stuff going in there at the moment, especially related to AI. Webaverse has always been doing a lot of AI stuff if you look through its history, but it's only been supercharged now, especially with Stable Diffusion. There are some things going in there like Stable Diffusion for 3D, things that are pretty mind-blowing in terms of text-to-whatever, because then it's kind of a full-circle thing: you can do text to image and image to text. And it just kind of reminds me of secondhand smoke and inhaling it and just hotboxing this metaverse, you know. I'm kind of working from the outside in by helping different projects who want to bring their collections into the open metaverse, such as by creating a VRM avatar collection from their current collection. So advising them and basically creating tools and pipelines to convert their assets and rig them for metaverse interoperability. And not just avatars: avatars can be pets and NPCs as well. So some of these things can basically come to life in all sorts of different ways.

Nicholas: And could you give us a little refresher on VRM? What makes VRM so cool?

Jin: Yeah, it's the kind of premier file format for avatars. It was designed from the ground up for interoperability by the Japanese VR community, and it's based on glTF 2.0, an open standard file format. It can work with hundreds of applications already without needing a permissioned SDK, which is the case with something like Ready Player Me, where you go through the SDK to get a hook into the different platforms that support them. With VRM, it's just a file that you can upload and bring with you into different things, and they all work.
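
To make that concrete, here's a minimal sketch of loading a VRM on the web, assuming the open-source three.js and @pixiv/three-vrm libraries; these are my example choices, not tools named in the episode. Because VRM rides on glTF 2.0, the stock glTF loader handles the parsing and a plugin lifts out the avatar-specific data:

```ts
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
import { VRMLoaderPlugin, VRM } from '@pixiv/three-vrm';

const scene = new THREE.Scene();
const loader = new GLTFLoader();

// VRM is carried as a glTF 2.0 extension, so the standard loader parses the
// file and the plugin extracts the avatar data (bones, expressions, metadata).
loader.register((parser) => new VRMLoaderPlugin(parser));

loader.load('avatar.vrm', (gltf) => {
  const vrm: VRM = gltf.userData.vrm;
  scene.add(vrm.scene);

  // License and usage terms travel inside the file itself, as Jin describes.
  console.log(vrm.meta);
});
```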

Nicholas: Would you say it's equivalent to something like SVG? Or maybe a little bit less standardized than SVG thus far?

Jin: It's less standardized than SVG at the moment. Version 1.0 was just released and some bugs are still getting ironed out in terms of the toolchain. But it's useful for all sorts of things, not just virtual worlds, but VTubing as well.

Nicholas: Okay, so it lets you create 3D models that have all the textures and maps required to render nicely. And they're coded in such a way that any platform or software that integrates VRMs will be able to read them. Do they also include behavior or just like physical aspects of the thing?

Jin: Not really behavior, but they can include blend shapes for different emotes. They have the physics for hair and body-type movements. And they also have the license information in the metadata of the file, so you don't have to keep the use slash license info separate from the actual file, which is how a lot of NFTs are designed. It can be all in one.
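
Continuing the earlier three-vrm sketch (same assumptions), the emotes and hair physics Jin mentions surface as named expressions and spring bones that you drive once per frame:

```ts
// Continuing the earlier sketch: `vrm` is the loaded VRM instance.
const clock = new THREE.Clock();

function animate(): void {
  requestAnimationFrame(animate);
  const delta = clock.getDelta();

  // Blend-shape emotes are exposed as named expressions, e.g. 'happy'.
  vrm.expressionManager?.setValue('happy', 1.0);

  // Advances the spring-bone physics (hair, clothing) each frame.
  vrm.update(delta);
}
animate();
```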

Nicholas: Cool. And given that the spec is not entirely finished and we're headed into a revision of it, how does that affect projects that have already used or deployed VRMs for their NFTs, maybe in an immutable way?

Jin: It's pretty much no problem. There's tools that can update things and it's backwards compatible. It's going to be a seamless transition.

Nicholas: Cool. Are there any VRMs people are using today that people might have already seen out in the wild and not have known that they were looking at a VRM?

Jin: CryptoAvatars definitely comes to mind. For CryptoAvatars, ToxSam has made hundreds of open source avatars, and many of them are among the most popular avatars in places like VRChat; you'll notice them everywhere once you see them. The Banana Guy, for example, is in a lot of places.

Nicholas: Cool. So that's VRM. You've been thinking about VRM, and how does that relate to the Webaverse stuff that's being built right now?

Jin: Webaverse has been supporting VRM since day one and basically made it a pillar. Custom avatars should be one of the pillars for building an open metaverse. Avaer designed the best avatar system for the web a few years ago and since then has been continuously building new features on top in an open source way, so the animations and whatnot can also be adapted for other places. VRM gets a lot of unlocks in Webaverse's pipeline. They've built tools where you can take a VRM, and an artist can easily do this, and add AI to it with any kind of AI backend that you want. It can be GPT-3, it can be GPT-J, it can be all these other things, and you can train it to make it your own. You can also use VRM in Webaverse to generate optimized versions of it. So let's say you have one that is 40 megabytes big because the textures and the polygons are just way out of control. There's a plan to componentize this process into its own separate thing, but just by dragging and dropping and changing quality-control sliders, it can generate a Doom sprite version of it, so all of that gets converted into something as optimized as a billboarded 2D sprite sheet that takes up only one draw call. And Avaer has been working on optimizations for 3D avatars to only take one draw call as well, which basically lays the groundwork for high CCU, concurrent user, type of stuff.

Nicholas: Got it. Awesome. I remember you demonstrated last time we talked a few months ago that you could just drag VRMs (it was VRMs, right?) into the interface of Webaverse and they would just show up, and that if people were attaching those VRMs to their NFT metadata, when you connect your wallet you'd be able to just view and interact with them, with the avatars interacting with whatever environmental things you were importing. Is that still a good example? Or maybe you could talk a little bit more about this machine learning, stable diffusion stuff. I saw a tweet recently about someone in a 3D surround space where the surroundings were morphing, in this stable-diffusion, exploring-the-latent-space-of-a-machine-learning-model kind of way. Is that the kind of thing you're imagining?

Jin: That kind of stuff for sure, one day, especially with the Dank Nugs project, where you burn it and you get compute credits that can transform your current scene. That would be epic. Some of this stuff you can already see; there was a post about it recently. He's doing a lot of AI training for generating voices and 3D models and stuff, and just building a whole storytelling framework in which, in the future, I can imagine being in a Discord or a virtual campfire and the stories that you're telling can come to life, and it helps you build experiences. And one of the things that was teased recently was a DreamFusion output for text-to-3D models that also gets automatically optimized into voxel versions.

Nicholas: I'm imagining like a Sir Moore kind of interview where as they're speaking, the environment is changing to reflect the stories they're telling.

Jin: Yeah, I think there's going to be backwards compatibility with a lot of podcasts and stuff, especially with OpenAI's Whisper doing really good speech to text.

Nicholas: That's pretty crazy. Okay, so Webaverse has basically a stacked bank account now and is able to spend a little bit more on building, working on integrating some new things and making the platform fuller. Has it changed the Webaverse vibe? Because I know when we first talked, my sense was that there were a handful of people who were interested in the Discord, but now it must be many, many more people who are paying attention to Webaverse and the sort of network of ideas that are connected to it. Have you seen a big change in the community, or the size of the community, or people paying attention to relevant technical issues since the drop?

Jin: Not that much. I mean, yes, but it's also the bear market, and so a lot of people are heads-down building still. I think there are more eyes on the project, but we're very much all just focused on trying to get some of these features out to creators to build out their experiences. And the vibe, from what I've noticed, has been a lot of hype and attention around AI creation tools really enhancing the creator workflow, and just a lot of excitement around that.

Jin: A little bit of both, you know, because Webaverse has been really deep into AI stuff for a while. But once you get the bug of building virtual worlds, building just becomes more fun than exploring worlds sometimes. I'll spend maybe 10 hours building a world that I'll spend one hour enjoying, or something. So I don't know if I'm the best person for a vibe check from the other side of the spectrum.

Nicholas: Do you think that's a common experience amongst people you know in the metaverse space that they love building more than they even love exploring?

Jin: The crowds that I hang out with, for sure. Those are the kind of groups that I instantly click with any platform that I visit.

Nicholas: For people who aren't as familiar with you, the rare few who don't know Jin, who don't know DankVR: you've been involved in Webaverse, but you have a longer history of working on metaverse-related stuff, doing research and creating things like M3. Maybe you could give a quick rundown of all the projects that you're involved with right now so people have a sense of what to check out.

Jin: A lot of my projects and everything that I do, I work mostly in public and open source all my R&D for the benefit of the open metaverse. So you can check it out. It's in my bio, hackmd.io/@XR/book, which contains HackMD dev logs on all the various topics, from avatar interoperability on. If you Google avatar interoperability, you'll probably see my research as the first results. So I do a lot around that. I also do a lot around virtual production, because I think a lot of rivers and streams all flow into the basin of virtual production and game dev, especially in the world of NFTs. Those things activate NFTs.

Nicholas: When you say virtual production, you mean like VTubing or other kinds of video production inside of metaverse spaces or something else?

Jin: Yeah, correct. So the use of VR, real-time game engines, AI, basically real-time Hollywood, all purely 100% digital without needing physical sound stages and whatnot. So we do music videos, we do podcasts, we do interviews, we do shows, and we're working on a virtual production showreel in which we showcase community members' projects. And this is our way of also showcasing avatar interoperability, because we contribute a lot of feedback to working groups that are working on the specs, standards, and tools to improve things. But one of the best ways to spread adoption of these open standards is by showing, letting the work speak for itself, and also pointing to documentation on how others can do the kind of things that we do. So we try to generate content that actually demonstrates avatar interoperability in practice, and it's just plain fun.

Nicholas: That's awesome. When you say we, who are the people you're talking about?

Jin: So M3 is like a virtual hacker space. There are a lot of projects that happen under one roof. The entire vibe is teach, learn, build. We have weekly events on different topics throughout the week, every day. Mondays, we do virtual production meetups. Tuesdays, avatar interop. Wednesdays, we take a field trip to another Discord, like the Open Metaverse Interoperability Group, and we contribute feedback to them on anything that we've picked up, et cetera. So it acts as a place where people can bring their projects and work around other people who are working on interesting things. And that's, I think, the perfect environment for interoperability.

Nicholas: That's super cool. Maybe you recall, but I was around at the early days of M3, so it's fun to see that it's growing.

Jin: I recall.

Nicholas: How many people are involved these days?

Jin: It's a high signal group. I'll put it that way. We gate it by merit. So you have to complete a quest slash guide in order to gain membership by contributing like a presentation or something that you've worked on. I want everyone to have good collaboration skills. Teach, learn, build, you know? Not just extract value.

Nicholas: That's cool. M3 has no formal funding or anything, right? It's like a passion project for everyone who's involved.

Jin: It's been a passion project for years, but OpenVoxels is our way of starting to add some incentives to take things to the next level. We've been doing pro bono interoperability stuff with Cryptovoxels for a long time, and it's really not graduated past that initial proof of concept stage. So I just want to take things a little more seriously in that regard. I came into the whole Web3 space, and same with Avaer, you know, being open source from day one, wanting to solve for some of the failure modes and sort of broken value loops of open source software development. So we spun up a Juicebox DAO, and we're distributing grants through Dework to accomplish tasks.

Nicholas: So in Dework, you're sort of setting up what you want people to be working on and then distributing the funds that are raised from the Juicebox, basically?

Jin: Basically, yeah. Dework is amazing. Have you used it?

Nicholas: I played with it for a few minutes, but I haven't actually integrated it into any projects I'm working on.

Jin: It's amazing. You can set up different roles, and then you can say how you acquire them. It's super flexible. It's like Trello combined with GitHub and all these other things, and it's very composable. It doesn't add too much overhead. It meshes very nicely with Discord and GitHub. It's one of my favorite Web3 products in recent memory.

Nicholas: That's great to hear. So I'm curious. I was actually talking about this with a friend, Zug, who you might know, and I'm a little skeptical, at least from my experience in the DAOs I'm involved in. I'm mostly involved in Juicebox DAO, but I've also spent time in SharkDAO, PartyDAO, and a handful of other DAOs. And in my experience, bounties have never been a major part of how we organize work. It's more been these long-term, multi-week relationships, with a variety of goals for each person laid out in a proposal that they make in advance of being paid, in advance of executing the work. Whereas in Dework, are you using this primarily to lay out bounties for people who are tightly affiliated members, or is the advantage that it lets new people come in and see what they should be working on?

Jin: I would say when entering M3, it still feels like you're entering a construction zone. You need to pick up a hard hat, and you see some things just lying around, like, is that supposed to be there? So it's still all very much a work in progress. That said, I'm very interested in the bounty system. You can set up a reward for completing tasks, where a PR merge on GitHub automatically moves things into the done category, and then you can pay out through a Gnosis Safe or MetaMask very easily, even in batch payments. And I'm interested in rewarding not just tokens. Right now I've been distributing ROBOT grants; ROBOT is the token for MetaFactory. But also NFTs, ERC-1155s. I want to gamify things, like when you go to an NPC and ask for a quest in a video game and they reward you with a key or some kind of item. I want to do stuff like that. And maybe that key or item can unlock the next task or role you can take on.

Nicholas: And so by holding it in your wallet, you get access to additional tasks in the token-gated bounty system.

Jin: Yeah, that's my goal, sort of gamifying things so that you can level up and earn tokens, which you can also attach to your avatar to make a fashion statement around the metaverse about things that you've done, not just what you've paid for.
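
A minimal sketch of that token-gated unlock, assuming ethers.js v5 and a hypothetical ERC-1155 "quest key" contract; the address and token ID below are placeholders, not anything deployed by M3:

```ts
import { ethers } from 'ethers';

// Hypothetical values for illustration only.
const QUEST_ITEM_CONTRACT = '0x0000000000000000000000000000000000000000';
const QUEST_KEY_ID = 1;

// Minimal ERC-1155 ABI fragment: all we need is balanceOf.
const erc1155Abi = [
  'function balanceOf(address account, uint256 id) view returns (uint256)',
];

// True if the wallet holds the key item that unlocks the next task.
async function canAccessNextQuest(
  provider: ethers.providers.Provider,
  wallet: string,
): Promise<boolean> {
  const item = new ethers.Contract(QUEST_ITEM_CONTRACT, erc1155Abi, provider);
  const balance: ethers.BigNumber = await item.balanceOf(wallet, QUEST_KEY_ID);
  return balance.gt(0);
}
```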

Nicholas: That's super cool. That's really, really interesting. So I know OpenVoxels is tied to this document where you've basically put all of your research into Cryptovoxels. Could you explain a little how M3 and OpenVoxels relate, specifically?

Jin: Yeah. So we've had a presence in Cryptovoxels from the very beginning. The Makers District, which is just east of the center, is something that we named, and it's the fourth oldest district. A lot of us knew Ben Nolan from years past, and we wanted to explore the convergence of WebVR and smart contracts and all that. So we co-located together, and we've been doing interop experiments since then. I've got a lot of screenshots on the HackMD for Voxels in which you can see ways that we've been able to export Cryptovoxels into other platforms, in terms of both the avatar and the world. Because I think with the concept of ownership, if you're buying an NFT for something, you should be able to bring that with you to other platforms. Otherwise, the platform kind of owns you, especially for something that's like a home. If it's hosted on someone else's server and domain, it doesn't really feel like you own the land, you know? I'm also just really interested in the concept of preserving history, and I think the world would be very different if we didn't have something like the Wayback Machine or Internet Archive for the web. Without history, without roots, society is easily reprogrammed. And I want to design something with which we can archive these worlds, because they're always changing. It's a dynamic virtual world on the Ethereum blockchain, but unless a community implements it themselves, which is what's happened here, there is no mechanism for the internet to have a memory. So OpenVoxels is taking a bite out of this problem of how we solve for memory and shared history. And the byproduct is that we're also able to standardize these exports as interoperable 3D assets, VRM, glTF, so we can bring content from Voxels into other places as well.

Nicholas: Yeah, it definitely makes sense to want the interoperability. I'm curious: you said the types of experiments you're able to do without too much support from Voxels are things like exporting the Cryptovoxels spaces into other experiences. And I'm looking at some screenshots here where you see mixes of Cryptovoxels environments or assets with much higher-fidelity experiences, which I guess are Webaverse or other metaverses.

Jin: Yeah, we've been able to port it to a ton of other game engines and platforms: VRChat, Webaverse. Substrata also has a Voxels loader, JanusWeb, Hyperfy, many, many places. I mean, it's...

Nicholas: So I guess they're still fetching the data to be exported and then imported into another world from those centralized Cryptovoxels servers at present? That data isn't...

Jin: No, you only need to do it once. So it's kind of like a way to snapshot a chain, right? But you're snapshotting a virtual world. Has anyone really forked a metaverse blockchain platform yet? I don't think so. And I want to do that for many reasons. Let's just say you want to experiment with a feature, but you don't want to wait for the devs to implement it, because it might take them years, and literally it does, and you don't know if it'll ever make it into the main client. You can do so in your own sandbox. And that could be a way to pitch or propose to a community a feature that you really like, such as what I have on my screen now. There's a desert version, which would be sick for a Burning Man type of experience, or a snowy version. Weather, for example: you could do that on your fork. And that's something that's permissionless once you have a copy of the chain.

Nicholas: So the purpose of OpenVoxels is to kind of lobby for Cryptovoxels to open up some of their code so that you can experiment more and builders can interact directly with the code?

Jin: I wouldn't say that's the direct purpose. I would say that's one of the benefits. And I think it serves a really awesome use case where people can just run with ideas rather than filing feature requests and waiting and whatnot. It's more conducive to an open metaverse ecosystem that works more like an open source project.

Nicholas: It definitely makes sense that for metaverse projects, especially at this stage, if they're not going to be the Facebook version, they might as well be at the complete opposite end of the spectrum: open source and accepting of contributions from the people who are interested and can build things. It's also more like the composable Ethereum aesthetic. So it makes a lot of sense to me. So OpenVoxels is not specifically about that. It's more focused on the interoperability of the avatars, or how would you describe the mission?

Jin: And worlds.

Nicholas: Okay, and worlds.

Jin: Yeah. And I really like using Cryptovoxels in so many metaverse interop experiments because of how iconic it is. You can recognize it anywhere, right? Especially when the world was all black and white. So there are layers of memetics and sort of Schelling points, because you could compare Cryptovoxels to Minecraft on the blockchain, right? And many people know about Minecraft. So when playing around with Cryptovoxels, you can sort of abstract the idea to other types of platforms. It just provides a really interesting case study. We want to basically form a working group to take these metaverse interoperability things more seriously. And OpenVoxels is, I guess, like a DeSci project. It's an open metaverse interoperability lab. We don't want to make it all about nerdy science. We want to have fun. So I've always liked wrapping science projects into art projects. And one of the things we have planned in a couple of months is something like a metaverse build-a-thon that is like a burn, a metaverse burn. That's kind of what you're seeing lately on my timeline, where I'm showcasing sneak peeks and dry runs of this idea of combining OpenVoxels with Open Brush and creating sort of neon night-mode versions of it.

Nicholas: So how would you be executing this project? Would this be built on a snapshot of Cryptovoxels but presented in a different environment?

Jin: Correct, yeah. So we chose the date, well, I chose the date, December 13th, because that's when we took the last full world-scale snapshot of Cryptovoxels back in 2019. So we're going to do another snapshot. By the way, there's been over a year of daily data backups for Cryptovoxels hosted on Arweave. So we have data backups from every day for at least over a year as well. But I want to take a full mesh snapshot, which still requires a lot of manual human cleanup, and then remix it into many alternate realities, including a really sick night mode, and have events where we gather all corners of metaverse makers together to celebrate the arts and talk about interoperability in an environment that's inspiring.
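
A rough sketch of the daily-backup idea, assuming Node 18+ and a parcels endpoint of the shape the public Cryptovoxels API exposed around that time; the URL is an assumption, and the real pipeline pushes to Arweave rather than local disk:

```ts
import { writeFile } from 'node:fs/promises';

// Assumed endpoint returning every parcel's JSON; verify against current docs.
const PARCELS_URL = 'https://www.cryptovoxels.com/api/parcels.json';

async function snapshotWorld(): Promise<void> {
  const res = await fetch(PARCELS_URL); // global fetch requires Node 18+
  if (!res.ok) throw new Error(`snapshot failed: ${res.status}`);
  const parcels = await res.json();

  // Date-stamped file, one snapshot per day, e.g. snapshot-2022-11-01.json.
  const stamp = new Date().toISOString().slice(0, 10);
  await writeFile(`snapshot-${stamp}.json`, JSON.stringify(parcels, null, 2));
}

snapshotWorld().catch(console.error);
```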

Nicholas: I'm glad that Ben isn't stopping you from taking those snapshots like LinkedIn was trying to do a few years ago. At least you're able to...

Jin: He's been generally pretty chill about it, more supportive than DCL. M3 has made alternate clients for DCL, but their strategy is basically to ignore everything we've done. At least Ben has given tweets of support. He even put 0.69420 ETH into OpenVoxels. Sick.

Nicholas: Yeah. That's interesting. So by taking these snapshots, hosting them on Arweave, and then doing other events like this burn you're planning, is the idea that you're demonstrating that the community desires this, and is even willing to put money behind efforts like OpenVoxels, to encourage them to do something different? Or is it just enough to have your own kind of community, and maybe that ends up somehow forking the Cryptovoxels space?

Jin: I want to design things to be positive. I want to create better examples of this narrative of an open metaverse built on blockchain technology, because I think NFTs that only work on one platform aren't a very strong example for the whole ownership thesis. It's like buying a record that only works on one player, and the manufacturer of that player basically owns it in a way. So I just want to create.

Nicholas: Obviously you don't have real ownership if you need a rented browser. It's effectively like DRM built into the ability to even read the data or interact.

Jin: Yeah, we're seeing more of that too, like more cloud streaming types. I mean, they all have their place. I just don't think it would be great if that became so omnipresent that we're renting pixels and nothing else. We don't really even own anything, in a way.

Nicholas: Certainly seems to miss the point of a lot of what crypto allows. Of course, there will be centralized Flow, whatever, Dapper kinds of things. Maybe DigiDaigaku is in that neighborhood too, I'm not sure. But obviously you want to have at least one strong open source, accessible, modifiable universe also. And Cryptovoxels seems to have that. I know you mentioned one of the reasons you like Cryptovoxels is because it's just easy to jump into and start building stuff; at least in the documentation you mentioned that. So what is it that draws you to Cryptovoxels in particular?

Jin: I think that's something a lot of people mention when it comes to what they love about Cryptovoxels. There are two things: how easy it is to build and how easy it is to share. And so I think Cryptovoxels can be an excellent on-ramp into metaverse building. You can build your first home and presence in the metaverse there. But then I want you to be able to take that anywhere else and keep adding to it. And that can also go back into Cryptovoxels, because it's iconic; whether wearables or parcels, you can recognize that aesthetic. And even if it just serves as a base mesh, it helps create structures faster than 3D modeling from scratch.

Nicholas: I see on the M3 GitHub that about a month ago, in September, you posted a list of ideas around what kinds of interop features would be interesting. So we talked a little bit about VRM import and export, and alluded to the fact that those VRMs could be attached to NFTs, so that if you connect your wallet and the metaverse platform or protocol supports VRMs, they show up. But I also see here you mentioned interoperable wearables. I guess because in Cryptovoxels in particular, wearables are their own standard of 1155s. Is that right? And .vox. Sorry, say again.

Jin: And their .vox models, which not every platform supports.

Nicholas: I see. I see. Yeah, I'm thinking of the On-Chain Chain project by Rizzle.

Jin: Yeah, that one's dope.

Nicholas: Yeah, that one's dope. I have token ID 420 of that one, which is a golden heart, I think. And in that case, I guess they had to ship a separate NFT just so it would work in Cryptovoxels. So it would have been cool if there was interoperability around those wearables with other worlds. Are there any other worlds in particular where you would make use of that? I guess Webaverse would be one.

Jin: I always have my Voxel Katanas on my avatar, even if it's not a Cryptovoxels avatar. It just looks cool. I love the juxtaposition between a non-voxel avatar and voxel wearables. And that's one of the positive-sum outcomes I envision: right now, Cryptovoxels wearables are addressing a market size in the thousands, whereas if you add interoperability to them, you can serve a market of millions of avatars. And a lot of people, I think, would dig this aesthetic. If you had voxel wearables on some of these VRChat avatars, it would just look cool.

Nicholas: I guess some of the resistance is that these platforms, not Cryptovoxels in particular, but any of these metaverse platforms, want to have some kind of moat around the experiences created inside of them, while still getting the advantage of all the content that's created outside. Maybe the second reason would be that if they charge a fee for creating or selling wearables in their world, like in a Roblox-style model, they'd have to give up some of those fees if people were able to use NFTs they purchased elsewhere. Is that the kind of logic from their standpoint?

Jin: We want to do things in a way that can be lockstep with the community. And there's a lot of fear of the unknown when it comes to platforms and this moat. But you don't know until you really try. And I feel like land models are more susceptible to falling into that kind of moat, wanting to lock in their users, more so than avatar models, which is why you see a lot of avatar projects advertise both the interoperability aspects and the ability to bring your avatar with you across all these platforms. That's not really an advertised feature when it comes to metaverse land plays; they're just not as portable. But I think avatar interop is something we can prototype. The default Cryptovoxels avatar is a mannequin, and mannequins are sort of a metaphor for a placeholder. Anyways, they're about to support VRM. It's already live in the client; there's just no way for you to upload your own custom VRMs yet. But that's already a step that Voxels is taking to support this vision of interoperability. The next one is exporting VRM. And that list you saw on GitHub was actually created by chatting with the Voxels team about some ideas. Nothing is really set in stone. It's just some suggestions on how we can align on some of these initiatives.

Nicholas: I see you also mentioned GunDB support. Actually, I'm here in Bogota right now with Mark Nadal, the founder of GunDB, as it happens. It's the first time we meet in person. What is the application of GunDB that you imagine would enable interoperability?

Jin: GunDB is really interesting. And I hear that Voxels has been experimenting with it for scripting, so that maybe your scripts can be peer-to-peer between parcels and users. I just think it's a really interesting project to experiment with, and a way that we can build a more decentralized and peer-to-peer metaverse. That's something that might be interesting to check out.

Nicholas: Yeah, I've had Mark on the show, and if I understand GunDB correctly, it's essentially like a JavaScript library that you can import into any website that you're creating. And it gives you the ability to connect to other peers in this Gun network to create a kind of mesh network to communicate with one another and duplicate data across local devices. So you can have a kind of decentralized communications network for supporting, sharing information that would be relevant to present inside of a website. Is that a fair summary of what's interesting about GunDB?

Jin: Yeah, yeah, it sounds good.
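
A minimal sketch of the pattern Nicholas describes, using the gun npm package; the relay URL is illustrative. Any peer can write a value under a key, and any other peer subscribed to that key receives the update and keeps a local replica, without a dedicated application server:

```ts
import Gun from 'gun';

// Join the mesh via one or more relay peers (URL is a placeholder).
const gun = Gun({ peers: ['https://gun-relay.example.com/gun'] });

// Writer: put a value under a key; it propagates to interested peers.
gun.get('parcel-42').get('status').put('open');

// Reader (any peer): subscribe to changes on the same key.
gun.get('parcel-42').get('status').on((value) => {
  console.log('parcel status:', value);
});
```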

Nicholas: So it could be interesting for Cryptovoxels for enabling some, I guess, communication between peers in general?

Jin: Right. I think it's one of those interesting items, but when it comes to OpenVoxels I'm thinking more on the side of virtual production and showing stuff that people can really grok and understand. GunDB is one of those things that, when implemented, I mean, it's really cool tech, but I think it's going to go over most people's heads.

Nicholas: For sure, for sure. It's not cryptocurrency at all, it's cryptography, but it does seem to enable for some people these really large-scale networks of video clients and other kinds of things. It also allows you to do WebRTC from a client without needing the centralized signaling server that WebRTC usually requires. It would be interesting to see metaverses adopt these things more, so that we have less dependency on centralized servers. But you mentioned virtual production as being your focus. Maybe you could share a little more on what that means to you, or for people who haven't filmed in the metaverse or created productions, what that entails.

Jin: So VR and AI are both making motion capture and animation extremely accessible to the masses. And once you get your bearings with avatars and virtual worlds, what do you do in these virtual worlds? I think something that's underrated in terms of world design is taking cues from Hollywood in terms of set design. One actor can play many different roles in terms of costumes and, with AI voice changers, many different types of characters as well. So I see the future of work aligning closer with Hollywood in the near future than with office work and office meetings, like what we see in Horizon Workrooms.

Nicholas: That's very interesting. In the Hollywood world, all the rage is these Unreal-powered sound stages like they used for The Mandalorian, right? So it's sort of these physical spaces with simulated surroundings.

Jin: Yeah, they're using all these like realistic places and realistic avatars. And I guess they just want to tell stories from realistic sort of universes. But I want to tell stories from the Metaverse using Metaverse native characters and storylines and places.

Nicholas: That's dope. I see. Is it a boombox head in the screenshot we're looking at right now?

Jin: Yeah, BoomboxHead has been under my wing learning about virtual production for a long time now, making music videos and whatnot. And these are some screenshots from the Cryptovoxels world in VRChat, where we have camera systems and sound stages set up.

Nicholas: That's awesome. Is there a place where people can go check out like the end result of those productions or some of the experiments?

Jin: I'm working on an event a couple of weeks from now. It's going to be a side event to Raindance Immersive, a VR film fest in VR. If you Google Raindance Immersive or search for it, you can find more info. But this event is going to be a premiere of a lot of virtual productions coming from the M3 community. It'll be in-world, in your browser, and in VRChat. So I would wait for that. It's going to be dank.

Nicholas: That sounds awesome. All this brings to mind: I sometimes work a little bit with Lexicon Devils, who are like a voxel architecture firm. They've been receiving a grant from JuiceboxDAO for quite a while to build what's now called the Juicebox Transit Center; I think previously it was the Education Hub or something like that. And they've been running these monthly concerts called Forming, which they now have a Twitter for, @forming__. They're bringing musicians into Web3, and specifically into metaverses, by having them record little videos, and then they do a Twitch live stream playing those videos spliced with sort of Eric Andre, Adult Swim kind of skits that they put on. And one thing that's come up in conversations with them, as those concerts start to take off and get a bigger audience, and really awesome music, frankly, is how to produce video in those spaces so that people who aren't there at the time can experience it later on social media. Is that one of the primary modes you think of for this kind of digital production?

Jin: Yeah, for sure. We've made music videos and shows in the past, and nowadays we're doing a lot of R&D, pre-production type work, to see how we can better do performance capture, which is the art of being able to record and replay all of the actors' emotions, sounds and emotes during a production. It gives us a lot of control in post and the ability to relive those experiences and do all sorts of things with lighting and camera work afterwards. A lot of virtual production stuff also allows us to satellite experiences out to other instances or metaverse platforms. I've got some examples on my Vimeo in which I take something in VRChat and export it into the browser through 360 depth-map streaming. So a lot of people who can't jump into VRChat are still able to see it, and not just as a 360 video, but as something they can sort of walk around in. It's a little bit laggy on my computer, but yeah.

Nicholas: So essentially you record the motion data of all the models in the space, and then you can travel through it with a camera in post-production and reshoot the entire thing, maybe even relight the whole thing. But what's different about the 360 depth field versus just a 360 video? That people can move around in it, I guess, is the difference.

Jin: Yeah, it allows you to at least move around in the space. But like you were saying before, let's say we make a music video. We can then infinitely remix that music video because we have all the data to reconstruct it. Same with movies and shows. One of my dreams is to do a feature length based on CC0 assets: CC0 characters and sets and scripts and all that, based on the public domain. And a lot of good stuff is based on the public domain, you know, classics: Alice in Wonderland, King Arthur, Cinderella, et cetera, all public domain in terms of the underlying script. And that's why I do a lot of M3 onboarding talking about Loot, because I think Loot holds a lot of wisdom for unlocking some of the true potential of NFTs in virtual productions and game dev.

Nicholas: I definitely agree. Do you think the particular insight that it contains is more related to the fact that the on-chain metadata makes it composable and the licensing makes it accessible for people to do that composition or something else?

Jin: Those definitely. And I think that interoperability isn't just like the canonical form of your avatar wearable traveling between places. Perhaps it's the traits that persist between places that can also be stylized by the different platforms, especially as these AI content tools get better.

Nicholas: That's cool. So you can go into a kind of Candyland world and you have all of the NFTs you already have, but they have a sort of changed aesthetic or something like that. Yeah.

Jin: I sort of think of text as LOD zero. It's the metadata.

Nicholas: And on the topic of Loot, you mentioned CC0 projects. There is this kind of big dichotomy between Loot and Nouns, where Nouns collects a huge treasury from its auction minting and Loot collected nothing. I think now they collect a small royalty on the collection.

Jin: Oh, they got some good royalties. I think it's around $750k to a mil. Yeah.

Nicholas: Okay.

Jin: And it's called a royalty DAO.

Nicholas: Oh, interesting. And that's like a 5% or something on the trades of the collection?

Jin: Yeah, something like that.

Nicholas: Nevertheless, it's a lot less than however much Nouns has. And some people point to that as the difference in how vibrant those ecosystems are. Do you have thoughts on that?

Jin: Not really. I was never really interested in the financial side. I thought of Loot as school supplies. So I bought some More Loot and HyperLoot. And I think of it as being here to learn and build and align with builder communities and people who are on the same wavelength as me.

Nicholas: I see you've pulled up VRoid Hub. For people who haven't played with it, what's VRoid Hub, and what are these avatars we're looking at? Is that a Vitalik one? A Prince Vitalik one?

Jin: Yeah. So VRoid Hub is sort of a place where you can upload and share VRM avatars, and there are millions of avatars that have been uploaded to the platform. We've been equipping them. So Webaverse and M3 modeled all of the Loot items, and we've been working on avatar builder programs and, in general, equipping a bunch of different avatars with MetaLoot. And this goes into the avatar interop group at M3, in which we're thinking about avatar interop not just in terms of avatars traveling between places, but how wearables can adapt to different avatars too. Because I think of NFTs as kind of a bundled loot bag, in a way, which you can equip onto any kind of base mesh. And that would be the ideal, not just avatar interop with one base mesh out there; there are lots of shapes and sizes of avatars. So MetaLoot is the ingredient we're using to test with, because it also has corresponding on-chain text that we can use. And the whole philosophy behind Loot is very interesting from an interop perspective. What you're seeing with the MetaLoot collection on VRoid Hub is a whole bunch of avatars that have been pre-configured with Loot. Shout out to Semmerroy for helping with most of these. And we're going to work on some little virtual production snippets. I've got all these uploaded into VRChat as well, so we can record little clips of avatar interop as a group, not just as an individual traveling between platforms. Because that's how I've been operating for years, which has been successful for spreading adoption of open standards like VRM. But I think it would be way more epic if we had a group of people with the same kind of avatar, tied together with a wearables collection, traveling between many different platforms.

Nicholas: Definitely. It's frankly crazy that you can't do that, that they've bastardized the term metaverse so profoundly that this isn't sort of table stakes for what it all means.

Jin: Yeah, the whole metaverse and now open metaverse is getting watered down and appropriated by folks. And I just want to kind of do more showing than talking.

Nicholas: Definitely. So this MetaLoot project is like 3D representations of the items in the Loot bags?

Jin: Correct. Yeah.

Nicholas: And then you can equip them. So if you have a Loot bag, you can equip them, or distill them or something from the NFT, and then add them to your VRM profile. We're looking at Prince Vitalik wearing Loot. It's pretty cool, honestly.

Jin: Yeah, we've been working on this for a while; we're like 80% there. The other thing that's coming with this, and my Twitter is freaking out right now in terms of Spaces, so if I drop out, that's why, is a really magical minting slash claiming experience where it kind of feels like you found a secret treasure room on the internet. And then you can claim not only your Loot bag, but also your Synthetic Loot. And that's also what draws me to Loot: the Synthetic Loot contract is really cool, every Ethereum wallet getting its own Loot bag. That makes me think that onboarding in the future could be like creating a character in an MMO. You create a character and then you can see some of the wearables that you have automatically.

Nicholas: For people who don't have context, Synthetic Loot works kind of like Blockies, those little pixelated representations of your wallet address that let you differentiate wallets in MetaMask or whatever wallet you use: little images generated based on the public address of the wallet. Synthetic Loot does a similar thing, but generates a Loot bag based on your wallet address. So there's no paying. Is there even technically an NFT, I suppose, if you read the token URI? Everyone owns their own NFT?

Jin: It's called a virtual NFT. You can't trade it. It's a soulbound token, basically.

Nicholas: Right. Pretty cool. But it's basically ERC-721 standards compliant; it's just that the existence of the wallet implies the existence of the synthetic or virtual NFT. Right. Yeah.
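
For the curious, reading Synthetic Loot looks roughly like this with ethers.js v5. The contract address and function names below are from memory of the deployed contract, so treat them as assumptions and verify on Etherscan before relying on them:

```ts
import { ethers } from 'ethers';

// Synthetic Loot mainnet address (from memory; verify before use).
const SYNTHETIC_LOOT = '0x869ad3dfb0f9acb9094ba85228008981be6dbdde';

// The contract takes a wallet address, not a token ID: the bag is derived
// deterministically from the address, so nothing is ever minted or traded.
const abi = [
  'function weaponOf(address walletAddress) view returns (string)',
  'function tokenURI(address walletAddress) view returns (string)',
];

async function lootFor(wallet: string): Promise<void> {
  const provider = ethers.getDefaultProvider();
  const loot = new ethers.Contract(SYNTHETIC_LOOT, abi, provider);

  console.log(await loot.weaponOf(wallet)); // one gear slot, e.g. the weapon
  console.log(await loot.tokenURI(wallet)); // data: URI with JSON + SVG image
}
```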

Jin: Yeah. And right now that's the only interface I've seen so far. I mean, one of the things that inspired MetaLoot is that there aren't really any good game dev assets to experiment with. I love the concept of Loot, but we need a visualizer in which we can use these things, not just as avatar wearables. It would also be cool as item drops, you know. We have some ideas in which you can claim a chest that contains your MetaLoot. And maybe down the line you'll have a choice: either you claim it for your own heroes, or you sort of choose the dark side, per se, and it can become wearables for an NPC mob that's aggro, to protect your lands or something.

Nicholas: That's a great idea. And you would be doing that with the MetaLoot, you were saying?

Jin: Correct. Yeah.

Nicholas: Okay. So actual tradable NFTs, not the virtual synthetic ones.

Jin: Yeah, that would be one of the tradable ones.

Nicholas: So is that the kind of project, if people are looking at OpenVoxels and thinking, I don't know why I would support this, that OpenVoxels is funding in one way or another? Or if not, what are the specific things OpenVoxels intends to fund?

Jin: OpenVoxels has a Dework already, so you can see some of the things we're working on. We're working on a parcel minter, so you can take a snapshot of your parcel and mint it as a 3D NFT. We're working on guides for how, once you export your avatar or your world, to bring it into other 3D editing environments and add to it, and how to bring it into other places. And we're also working on a Metaverse Burn event in a couple of months, to showcase a lot of other ideas together and do a build-a-thon: taking a snapshot of the entire world on the third anniversary of the last snapshot, exporting it, and basically building alternate realities on top of it. We want to become a lab, not just for interoperability experiments, but for art projects as well. Creating different visions of Cryptovoxels gets unlocked, because with interoperability you can unlock the strengths of the platforms you import these assets into, rather than one platform feeling spread thin trying to incorporate and design features on every front to compete. Trying to find ways to compose.

Nicholas: Yeah, I saw some demos from you of a video game arcade machine in other metaverses that would sort of let you peek into a Cryptovoxels space.

Jin: Yeah, we've got arcade machines. Actually, this intro vid is probably the best place to start. It was uploaded almost exactly three years ago, and it showcases many of the early Cryptovoxels interop experiments, from virtual production to vending machines and arcade machines and different platforms, and going from one platform to the next. Yeah, this video had a lot of ideas that I'm still working on. It kind of sucked that we put a lot of work into these experiments and they just don't get past the proof of concept stage, you know, no matter how awesome they are. You could, for example, imagine bringing Cryptovoxels into Minecraft and then adding some of the realistic shaders on top. It would look so cool. But yeah, we can't just expect to do all that kind of stuff pro bono forever.

Nicholas: So the idea is that with funding, OpenVoxels contributors, who are these decentralized contributors, and I suppose it's somewhat open to people, are able to build out some of these ideas and make them more or less production ready.

Jin: Yeah, and we're going to provide assets for people to use in their projects. I've already begun uploading snapshots of Cryptovoxels from 2019 to Sketchfab, and we'll be doing some collections around that in the future as well. So making it easy for people to remix and participate in it. That's the whole point of The Burn, right? The Burn is a whole city of contributors coming to collaborate and build for seven days. It's a build-a-thon. So we want to align with other metaverse makers from different corners and serve like a metaverse potluck: bring our ingredients and dishes to the table and create something greater than the sum of its parts.

Nicholas: That's awesome. I'm really excited about that. I hope I'll be able to attend that. Will there be tickets for that or will it be open to anybody?

Jin: I haven't quite figured it out. Burning Man does require tickets, so I think supporting OpenVoxels will be one guaranteed way in. I want people to have skin in the game. You have serious skin in the game when you're stuck in a desert for a week in that kind of environment. On the web, it's so easy to context switch that you don't really feel too committed to anything, because you can just change tabs or whatever. So I am thinking about ways to substitute for that in a digital way.

Nicholas: If anyone in the audience wants to come up: I see Arashi, who I think is one of the contributors at OpenVoxels.

Jin: Arashi's been doing a lot of great work in general with M3.

Nicholas: Awesome.

Jin: Something we never mentioned was landmarks, too. The passing of Alotta Money, and just realizing a lot of his great works of art are in Cryptovoxels. I want to preserve those things for eternity. I'm also thinking about that: how do we preserve these metaverse monuments?

Nicholas: Is the primary concern that the Cryptovoxels servers eventually go down and the stuff is lost? Or is there an erosion of the art over time for technical reasons also?

Jin: It's a little bit of both. We can't just rely on Cryptovoxels to keep it up forever. The data is not truly decentralized. We can download the JSONs for our parcels and the world as well, but if we don't have any other place to read it, then that data is kind of useless.

Nicholas: You also mentioned that you're less interested in these realistic virtual spaces, like what The Mandalorian generates as backgrounds for its characters. What does the opposite of that mean to you? Something that's truly metaverse native, or virtual native?

Jin: I would look up BoomboxHead's music videos on YouTube, where he's got a whole crew of people in VRChat. Or look up We Met in Virtual Reality, for example. It just got released on HBO as well; it's a feature-length film that was shot entirely in VRChat. NeonBiddle has just started; that's one of the groups we're working on the metaverse burn with. If anyone wants to join, the link is hyperfy.io. If you want to continue the conversation, let's hop into Hyperfy and we can talk and do some world hopping directly related to all this stuff we're chatting about.

Nicholas: Awesome, let's do it. Before we go, just for the podcast version, where should people find you and where should they find OpenVoxels?

Jin: Check me out at DankVR on Twitter and at OpenVoxels on Twitter as well. You'll see the timeline of everything we're working on there.

Nicholas: Awesome, Jin. Thanks so much, and see you in Hyperfy right now.

Jin: Yep, see you all there.

Nicholas: Cool, thanks everyone for coming to listen and see you next Friday for another episode of Galaxy Brain.

Jin: Cheers.

Nicholas: Alright, see ya. Web3 Galaxy Brain airs live most Friday afternoons at 5pm ET, 2200 UTC on Twitter Spaces. I look forward to seeing you there.
