
TRANSCRIPT: RNP-021 & The Future of 3D Creation - A Chat with Divesh Naidoo on Simulon

25 min read · Oct 29, 2025

Monday’s Render Network Spaces on X featured Divesh Naidoo, founder of Simulon, for a deep discussion on the evolving landscape of 3D creation and invisible VFX: workflows so seamless they feel human-led and intuitive from concept to render. Divesh shared insights on Simulon’s official launch, its mission to unify fragmented 3D processes, and the potential for future integration with the Render Network to empower creators at scale.

The Render Network Foundation also joined to unpack RNP-021, the latest proposal expanding the network with enterprise-grade GPUs, a dynamic reward system, and a roadmap toward decentralized compute that supports both creative and AI workloads. Together, these discussions highlight Render Network’s broader mission: building the decentralized infrastructure powering the next generation of creators, tools, and experiences.

Listen to the full recording as you read through the transcript.

Sunny (Render Network Foundation) — 00:12

Hello.

00:17

I think somehow that was muted. Yeah, invisible VFX.

Divesh (Simulon) -

Yes, so I was saying that invisible VFX basically refers to VFX that’s so well integrated that you wouldn’t even know it was there. Yeah. And, you know, our thinking is: what if that applied not just to the end result, but to the process itself? So, you know, the process is so well integrated and so seamless

00:45

that it feels invisible to the creator. And so it’s not just at the point of consumption, but at the point of creation itself. You know, can we build CG VFX tools that feel very intuitive and very human? And I think, you know, CG VFX hasn’t evolved for a very, very long time. That’s just because it’s a very difficult, complicated process; you know, it’s been a

01:13

fragmented sort of multi-tool, multi-skill workflow. And we’ve essentially taken on the challenge of unifying that whole process to create a seamless end-to-end experience.

Sunny (Render Network Foundation) -

Yeah, that is a very good summary. And for those of you who want to delve slightly deeper, you can see a tweet in the pinned section here on this space. And yeah, there is a really good demonstration there as well. It looks…

01:39

very seamless, looked very, very realistic and, you know, dare I say, as good as any traditional render.

01:50

Yeah, and I think the important thing to probably mention is also doing all of that without sacrificing the human intention, which I think is really key. I think the only thing that actually matters is that a human is able to still direct the process and get something that they actually intended at the end. Yeah, absolutely. I think that is definitely a huge focus here at Render, and I know for our friends over at OTOY,

02:19

it’s always about the human-led, the artist-led vision and seeing that vision through to the end. Yeah, so far from what I can see of Simulon, it looks absolutely fantastic. I know many of our community are very keen on knowing whether there’s ever going to be any kind of integration with the Render Network. I know that kind of thing is potentially further down the line.

02:49

Divesh (Simulon) —

Yeah, yes, to achieve this, there are obviously multiple abstractions needed to create this sort of seamless end-to-end experience. And one of the big ones is rendering. At the moment, we’ve got a distributed cloud render service that’s able to scale up on demand to meet all the creators’ jobs. And I think we’ve had some interesting discussions around this and what the future of this looks like.

Divesh (Simulon) — 03:18

Because as we create more demand, you ideally want an economy that’s able to develop around something like this. And I think the use cases are just scaling up. More and more people are becoming creators. Tools like ours and many other creative tools are just making stuff more accessible to more and more people. And I think the Render Network has a very unique and probably

03:48

the only approach that I’ve seen with decentralized cloud compute and being able to access rendering in a way that’s very, very intuitive. So we definitely have plans in the future to explore much deeper integration.

Sunny (Render Network Foundation) —
Yeah, that sounds amazing. And I can only imagine when this is out of beta and when people actually get their hands on it and start producing things,

04:17

it’s just gonna snowball because of the ease of use, I think. Yeah, we are actually out of beta now, so we launched officially a week ago. Oh, well, there we go. You know what, let me get on that. I’ve still got the beta app, so maybe I need to get rid of that and re-download. Okay, excellent. Well, I will definitely get on that.

04:44

Yeah, amazing. Are there any kind of tips or tricks that you’d like to share with the community?

Divesh (Simulon) — 04:50

To be honest, one of the things that we haven’t done amazingly well yet is just showcasing the how. There’s a lot that’s possible with the tool. We’ve built in a lot of features and a lot of ways to achieve different and really interesting results. I think a core focus for us, you know, from this week forward,

05:17

is just doing a lot more deep dives, tutorials, and guides. There are a lot of really cool things that you can do. And I think it doesn’t really matter what you’re trying to achieve. If you’re trying to visualize digital assets in the physical world, that’s really easy to do. If you’re trying to make animated content, story-driven content, that’s really easy. If you want to integrate into

05:45

an AI workflow, it just gives you a lot more control over the end results. There’s also, you know, a very unique process, a unique workflow there. So yeah, I think we’re gonna be spending the next two weeks, and sort of indefinitely, just highlighting how all of these things are to be approached with Simulon. Excellent.

Sunny (Render Network Foundation) — 06:13

Yeah, I can’t wait to see more and learn more about Simulon. Brilliant. I think Phil might have a couple of questions.

Phillip (OTOY) —
Yeah, yeah, I had a quick question. Well, first of all, it’s been amazing to see some of what folks in the Octane and even Render community have already been creating on Simulon. Like, Smearballz had this cool AR hippo

06:41

thing that was just amazing, and Voidz, also a big friend of the Render Network. You know, one of the big theses that we have at Render and at OTOY is that there’s just this huge democratization happening in 3D, and, you know, looking out to five years from now, potentially a world where hundreds of millions of people are creating in 3D for mobile apps, you know, from their,

07:11

you know, mobile devices, rendering at the push of a button. What’s your view on kind of where things are heading with consumer 3D, and, you know, how do you see Simulon fitting into this whole kind of democratization movement that’s happening?

Divesh (Simulon) -
Sure. Yeah, so I think 3D is definitely the most…

07:36

intuitive and natural way for humans to create, just because the world is 3D, you know, and we live in a 3D world. And I think, you know, at the moment, we’re seeing a lot of really interesting things happening. Obviously, gen AI, you know, has taken a lot of the public eye and perception and conversation right now. And it’s been focused a lot on, you know, generating 2D pixels and a 2D

08:04

workflow, but I think those things are going to start to converge. And there are some really interesting things that we’re doing in R&D. And again, we won’t release any features or tools if they come at the expense of human intention, because that’s just our core values. So we are very excited about this convergence of 3D and gen AI. And I think that’s going to lead to some incredibly, incredibly exciting tools

08:33

for creators in the very near term. And yeah, I would say you’re 100% right that tools like ours are gonna allow people to create at the speed of social media, and close to sort of the speed of thought, which was just never possible before. We’ve had a lot of creators telling us, since Simulon launched, that

08:58

it’s amazing to sort of think up an idea, or have a bank of 3D assets, or even access assets from other artists, and bring them into the world and create something and have final renderings ready in minutes. And that means, you know, you can create around very relevant events and things that are happening in real time, as opposed to the typical multi-week-long visual effects process.

09:28

Yeah, I think it’s going to be very exciting times coming up and redefining what storytelling and creativity really is within sort of 3D and these AI workflows.

Phillip (OTOY) -

Amazing, amazing. Yeah, it’s going to be cool for everyone to start creating “Everydays” just kind of like Beeple does. I think you touched on this briefly, which was a little unexpected, but

09:56

You almost sort of mentioned that you want what you’re doing to protect creators and protect artistic authenticity. What are your thoughts about sort of bigger-picture questions like provenance and verification of creators’ works in social media,

10:24

or even, like, the ability to remix other people’s works, which I think you sort of loosely touched on? Those are some of the big-picture things that we’re also thinking about as well.

Divesh (Simulon) -

You know, I mean, it’s such a philosophical gray area, I guess. My wife, Elsa, is in the space. I actually recently shared an article that I came across where…

10:53

she’s the seventh most prompted photographer on Midjourney. And it’s interesting, because she actually welcomes it and finds it quite interesting and different, an evolution of things, while some artists or photographers, you know, might have the total opposite feeling towards it, which is also justified. It’s like: cool, my, you know, my stuff is a part of

11:22

what makes this thing special, and I’m not really, you know, rewarded or anything for that. So I think it really depends on the individual’s perspective. But the fact is that technology is evolving. And I think people should definitely look at, you know, the pros and where this is gonna go and what it is going to enable them to do as creators, because I don’t think

11:51

any of this stuff is gonna stop anytime soon. And it’s quite exciting, personally, for me. I would say, though, that protecting artists’ rights with 3D is a lot easier as well, where you can ensure that provenance is built into the creation, the entire process or pipeline. And that’s something we do at Simulon: we track the

12:20

assets that are being used to produce a final result, and we can sort of automatically credit everyone that’s involved in the process. And I think creating collaboratively is just more fun, more rewarding. And you also end up with an end result that multiple people are proud of and were part of. I think that’s part of what makes it so special, you know, and we shouldn’t really lose that aspect. Of course, it’s amazing to be an indie creator and be empowered.
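
Editor’s note: as a rough illustration of the asset tracking and automatic crediting Divesh describes, here is a minimal sketch. Simulon’s actual pipeline is not public, so the structure and every name below are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    asset_id: str
    creator: str  # handle of the artist who made the asset

@dataclass
class RenderJob:
    """Records which assets flow into a final render so credit can be assigned."""
    job_id: str
    assets: list[Asset] = field(default_factory=list)

    def use_asset(self, asset: Asset) -> None:
        self.assets.append(asset)

    def credits(self) -> set[str]:
        # Everyone whose asset appears in the final result gets credited.
        return {a.creator for a in self.assets}

job = RenderJob("scene-001")
job.use_asset(Asset("hippo-model", "artist_a"))
job.use_asset(Asset("city-hdri", "artist_b"))
print(sorted(job.credits()))  # ['artist_a', 'artist_b']
```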

12:49

But yeah, that’s my thinking, more so from the creator’s perspective and intention. I touched on that a little bit in the beginning and in some posts that I made. I think that part is very, very important, as is making sure that the process is still enjoyable, because I think I mentioned something like this to Elsa recently where, if you…

13:19

We actually became parents recently, and we were thinking about this: if you sort of take a child to the beach to build a sandcastle, you know, you wouldn’t want the child just instructing you and then having a sandcastle appear. It’s very different from going through the process and feeling how enjoyable it is, getting fulfilled and rewarded

13:47

through the process of creation, which I think is sort of a fundamental human desire and need, in my opinion. So I think abstraction is good, but not to the point of abstracting away intention itself.

Phillip (OTOY) — 14:03

Amazing, amazing. I know Spencer has done a deep dive on Simulon this weekend. So maybe Spencer, if you want to jump in with any questions or thoughts.

14:21

Actually, I realize he’s a listener. But no, that was amazing. I guess those were the big questions I had. But we’re such huge fans of what you’re developing, and excited to hear some of your thoughts about leveraging the Render Network as you scale up.

Spenser (OTOY) -

Yeah, I’m on here now. Yeah, I’ve got it downloaded and I’ve got some scenes I’m ready to just jump in with. I haven’t actually had a chance to dive in

14:50

You know, too deep yet. I was excited for this call just to hear a little bit more about it. But yeah, everything I’ve been seeing so far has just been awesome. You know, I’m going to have a ton to post here shortly this week to show what I’ve been kind of working with. But it’s very, very cool for sure.

Divesh (Simulon) -

Awesome. That’s exciting to hear. Feel free to message me at any point if you need some clarity. Like I said, we haven’t actually shared a lot of what’s possible,

15:20

so there are best practices I’m happy to share with you. Absolutely.

Sunny (Render Network Foundation) -

Thank you. Amazing. Well, it’s clear that you’ve already built a fan base amongst the Render community.

I think it’s well aligned. Yeah, absolutely. Absolutely. Well, thank you so much, Divesh, for joining. I’m sure we will speak to you in the future as well. I look forward to that. I will let you get back to sleep.

15:50

Awesome, thanks. No worries. Take care. Thanks. Cheers. Amazing. Yeah, that was awesome. It’s always good when we get people who are part of the builder economy, right? They’re building these tools for people to use, for artists to really take advantage of, and not to replace artists, which is a huge point of contention

16:16

with all of this AI future that we’re headed towards. So always having tools that improve the experience and, like I said, don’t take away that act of building. People can get lost in Cinema 4D for hours just playing around with things, seeing how different lighting looks and seeing how different assets look. And I know I do from time to time, and it’s a cool thing. You get in that state of flow.

16:44

And I don’t want that being taken away. So having tools like this, you know, is only going to be a good thing, right? Speaking of decentralized compute, we also have Tristan on this call. Tristan, how are you?

Tristan (Render Network Foundation) — 17:03

Hey, Sunny, I’m well, thanks. How are you? Amazing. I’m good. I’m very good. Thank you. Thank you so much for asking. Yeah, so you know what? Let’s give everyone a quick overview as to who you are, because you know what? You’re not on the Spaces all that regularly. We know each other very well. But just an overview of who you are and what we’re going to be talking about today.

17:30

Yeah, thanks, Sunny. So I am head of operations at the Render Network Foundation, which is based in the Cayman Islands. And I am also based in the Cayman Islands. So that keeps me involved in a fairly large number of elements relating to the foundation’s operations, from

18:00

many of the dry things, like negotiating and papering contracts and ensuring that the accounting is all carried out, to some of the more exciting stuff, like being involved in projects such as the ARTECHOUSE project that was recently launched, and I think many of the listeners will have heard so much about that.

18:30

I actually haven’t seen it yet because we also had a baby last month and so I was off for that. But I am going to be seeing it in December when I pass through New York and am super excited for it.

Sunny (Render Network Foundation) -

You are in for a treat, Tristan. It is spectacular. There are some pieces that, I just, it’s indescribable when you’re sat there, or just

18:59

watching, seeing these pieces, just the amount of time and effort, you know, that these artists have put into this. It’s amazing.

Tristan (Render Network Foundation) -

Fantastic. I’m super, super excited. And I think you’ve just come off seeing it. So you’re fresh with the experience.

Sunny (Render Network Foundation) -

Yeah, absolutely. So I saw it yesterday. You know what? Time has just gone by.

19:28

But I also, along with Edgar, interviewed MHX, one of the artists who produced one of the pieces. So we interviewed him at ARTECHOUSE, inside the venue, with his piece playing in the background. And let me tell you, it just felt so good to connect with MHX again and to be surrounded by his work. Because it’s one of those intricate pieces which…

19:55

But let me give you this little tidbit. So in the interview, we asked how long this took to render and how long it would have taken locally, on his own machine. And he gave us a figure of, I think it was, 26 months, so over two years to render on his local machine. Two years, Tristan. And on the Render Network, it took around a week.

20:23

And that week also consisted of many test renders and different iterations. And there we are. So that’s the kind of thing the Render Network does, which is absolutely phenomenal. I love hearing these stories. And it’s an honor to be a part of keeping this kind of wonderful product available. Absolutely. And talking about available products, we recently…

20:53

put out a draft for RNP-021. Would you like to tell us a little bit about what that is and how it’s going to help the Render Network?

Tristan (Render Network Foundation) -

Yes. So, it’s a really interesting topic, and the community’s been fantastic at giving feedback, both on typographical matters as well as

21:22

ideas around how something like this might operate. So to give a little context, the Render Network remains primarily focused on artists and its global network of consumer GPUs. That is still the major traffic on the Render Network.

21:52

RNP-019 introduced the ability to use consumer-grade GPUs for a wider range of compute tasks. And there are some overlapping applications, which I will see if I can do justice to. Some are very closely related to the motion graphics side of things, and that is where,

22:23

for example, a motion graphics artist starts to integrate an AI tool into their workflow, and that AI tool requires some computation. That is different from the compute required when one does a project in Octane and directly renders it on the Render Network. So we now have this

22:52

network in its alpha state, where we have a number of consumer GPUs onboarded, and the dev team are working very hard to get the orchestration layers established to handle workloads between machines, collaborating closely, of course, with the OTOY team that is doing some exciting things with

23:21

OTOY.ai

23:25

And also, at the same time, with the Render Labs team, who have been on the Spaces from time to time, and the products that they’re developing to use the network. And so from that, there are some learnings, such as that the leading image and video models cannot yet be run on consumer nodes. Now, there has been a huge amount of progress

23:54

in the last, I don’t want to pick a date, but it almost doesn’t matter, let’s say the last six to eight months, but also in the last 18 months, in terms of what can be achieved both on consumer nodes and on consumer nodes that are networked in a decentralized fashion. And when we were talking about this, you know, roughly a year ago, LLM training wasn’t really something possible on a decentralized network,

24:23

and we were targeting more on the inference side where models are put to use.

24:30

And then a few months ago, somebody solved the decentralized training problem. And so now there are potentially many opportunities to even train LLMs on a decentralized network. And all of these start to become available as the compute side scales. So there’s a very close tie with the history of the network in terms of its motion graphics

25:00

focus. And I guess we’re throwing in a little nod to the progress recently made by the Render Network in terms of VFX workflows, Sunny. And I think you’ve probably talked about this. But we think there’s also a really interesting opportunity there for taking on VFX workflows on the Render Network that

25:29

were previously more limited to CPUs with significantly larger RAM than our standard GPUs. But getting back to the compute discussion: along with what we’re seeing as tools that would be useful on the graphics rendering side, the macro market opportunities are significant.

25:58

You know, depending on which study you look at, the GPU market is projected to grow from roughly $83 billion in 2025 to approximately $350 billion by 2030, which is, for illustration’s sake, a compound annual growth rate of about 33%, which is significant.
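
Editor’s note: the quoted rate can be checked directly from the two endpoints, using the figures as stated:

```python
# Growth from ~$83B (2025) to ~$350B (2030), per the projections cited above.
start, end, years = 83e9, 350e9, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # prints 33.3%
```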

26:29

You know, in the realms of speculation, when you see things like the Elon Musk projects building terawatt data centers, it’s tempting to think that maybe these projections are conservative. So essentially, with all the growing demand for compute, it seems opportune to take our base

26:59

of building a decentralized network, our knowledge of graphics and our ability to network decentralized compute and expand it to include the opportunity for node operators to add enterprise-grade GPUs. And this is sort of…

27:23

one step of the RNP, the Render Network Proposal: a step towards bringing on enterprise compute. And I know I’ve talked quite a lot, so I just want to pause for a second and check in before moving on to the next topic.

Sunny (Render Network Foundation) -

Yeah, I’ll let you get a glass of water as well, because I know when I’m rambling, well…

27:52

I tend to ramble on and I tend to not take breaks on Spaces, and I definitely need a glass of water after talking for longer than a minute solid. So yeah, take a breather. I think that was some really good information there. The enterprise GPUs

28:12

are a huge unlock for the compute network. I think it’s H200s, H100s, this level of GPU with, I think, 80 gigabytes of RAM upwards. And these are primarily going to be used for AI and machine learning, and hopefully tied into the compute network that we have going on. So that would be great.

28:41

I think the community did have a couple of questions. If anyone here from the community would like to ask any kind of question regarding RNP-021, feel free to drop a comment, and we can absolutely take a look at the comments at some point in this call.

29:00

Because another reason for these Spaces is just to touch base with the community. So any kind of questions or feedback, anything like that, we’re able to speak to in real time, because often in Telegram and Discord and places like that, things easily get lost. So yeah, absolutely share any questions that you may have.

Tristan (Render Network Foundation) -

Yeah, yeah, absolutely, Sunny. So I will

29:29

jump in here, and thanks for the break; I had my sip of water. Okay, good. I do want to, you know, mention that I do still plan on hitting on rewards and job allocation, which I know are big questions. So I guarantee those are on the agenda.

29:50

But let’s take questions on, let’s say, the first part of the discussion, which is the focus on the traditional rendering side, the continued focus on artists, and the macro market opportunities.

30:09

Let’s see if we get any questions.

Sunny (Render Network Foundation) -

I mean, there could be some. I know that Phil has been keeping a close ear to the community as well, as has Silvia. So at any point, if you guys want to come up with a question from the community, feel free to jump in. You’re more than welcome to do that. But if there’s anything on the top of your mind, Tristan, yeah, feel free. Yeah, let me crack on

30:39

to rewards. So there are a couple of things worthy of mention. RNP-021 initially maintains RNP-019’s core mechanics for job allocation and rewards. There are two changes. One is to increase the baseline job rewards

31:05

for an RTX 4090 from 10 RENDER to 25 RENDER per epoch, assuming 100% utilization. So the mechanics initially will stay the same. But secondly, it introduces a path towards a dynamic auction reward system that allows compute nodes to set their own reward, and…

31:34

this also helps us solve the path towards scaling the network. So this is quite a significant change, and I’m currently working on the next draft, which hopefully we can share tomorrow or Wednesday, and which will include some more wording around this.
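
Editor’s note: to put the baseline change in concrete numbers, here is a minimal sketch. Scaling the reward linearly with utilization is an assumption made for illustration only; the RNP-021 draft defines the actual mechanics:

```python
OLD_BASELINE = 10.0  # RENDER per epoch for an RTX 4090 under RNP-019
NEW_BASELINE = 25.0  # RENDER per epoch proposed in RNP-021

def epoch_reward(baseline: float, utilization: float) -> float:
    """Availability reward for one epoch, pro-rated by utilization
    (the linear pro-rating is an illustrative assumption)."""
    return baseline * max(0.0, min(utilization, 1.0))

print(epoch_reward(OLD_BASELINE, 1.0))  # 10.0 RENDER at 100% utilization
print(epoch_reward(NEW_BASELINE, 1.0))  # 25.0 RENDER at 100% utilization
```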

32:01

But that is a key message worth sharing. Go ahead, did you have something? Yeah, no, I was just gonna say that that is one of the aspects I believe the community was querying and wanting to learn more about. So yeah, thank you for mentioning that. Absolutely.

32:30

One of the other questions that came up was around demand. So along with the macro indicators, we have seen one of the ecosystem partners that we all know, OTOY,

32:48

committing to use the compute cluster for their upcoming OTOY.ai platform. And that is designed to allow effortless AI image and video generation. So early days still, and…

33:14

I’ll leave it to them to give any further details on that, but certainly an interesting proposition.

33:25

Yeah, I think with OTOY.ai, it will initially launch separately from the Render Network. And then once models are, I guess, determined to be best run decentralized, those have the potential to be added to the Render Network. So those would then run off the compute network. I think that is the goal,

Sunny (Render Network Foundation) — 33:52

as far as I’ve kept up to date. I’m sure we’ll learn a lot more about that with announcements from OTOY directly. I do believe there will be a beta for people to test as well, but obviously more on that later today. 100%, I’m super excited to watch that one.

Tristan (Render Network Foundation) -

34:19

Another one of the topics that came up a lot in the questions was job allocation. And this has shifted slightly with the introduction of a move towards a dynamic auction system. In RNP-019, we talked about a rotational logic whereby

34:46

the user of the network would select the hardware that they wanted, if they had a preference, and then jobs would be allocated based on nodes being up and a rotational job allocation system: you know, the node that least recently received a job would move to the top of the queue. That’s the intention.

35:16

With the shift to a dynamic auction system, pricing would come into play.

35:28

I think it’s still relatively early in sort of putting this one down on paper. It looks, though, like it would be something where customers would have the option to choose hardware if they had a preference, or, if it was just general compute, the network would be able to allocate more broadly and then select based on price,

35:53

potentially a node reputation, and of course the node being available at the time that the job is initiated.
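
Editor’s note: a minimal sketch of the selection logic described above: filter to available nodes that match any hardware preference, then rank by bid price and reputation. The auction design is still being drafted, so this is illustrative only and every name here is hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    node_id: str
    hardware: str      # e.g. "RTX 4090", or an enterprise-grade card
    bid_price: float   # reward the operator asks for under the auction model
    reputation: float  # 0.0 to 1.0, higher is better
    available: bool

def allocate(nodes: list[Node], hardware_pref: Optional[str] = None) -> Optional[Node]:
    """Filter to available nodes matching any hardware preference,
    then pick the lowest bid, breaking ties with higher reputation."""
    candidates = [
        n for n in nodes
        if n.available and (hardware_pref is None or n.hardware == hardware_pref)
    ]
    if not candidates:
        return None
    return min(candidates, key=lambda n: (n.bid_price, -n.reputation))
```

The ordering here (price first, reputation as a tie-break) is one plausible reading; the final draft may weigh these factors differently.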

36:06

So then the next fairly major item in RNP-021 is opening up the potential to dialogue with, work with, significant users of compute, more significant and in a different way than traditional rendering. And those would be potential users who might look to block-book

36:36

GPU processing time for a period of days, weeks, or months, and lock in a rate. And again, going back to the macro environment that we find ourselves in, with GPUs being in demand, and seeing competitors in the space do this, we are looking to…

37:02

allow the Render Network, via the foundation initially anyway, to accept those customer orders and then go to market and procure, on a rental basis, GPU compute time to match that customer demand. So this is a significant shift from the current model,

37:33

and in some ways leans quite nicely into the dynamic pricing model or the dynamic auction where a node could bid for that work. So in the initial draft of RNP-021, we talked about the foundation going and securing that kind of compute.

38:00

I think it has evolved since, and it certainly is the preference, if we can find a way to make this work at the node operator level, that node operators are able to fill that demand. And it might be the case that it starts off manually, where the foundation, for example, goes and finds a provider to match, and then

38:30

it could evolve into something more automated, where the nodes would either provide that supply or go and procure that supply themselves. So, in an initially manual way, if the foundation were to negotiate a block supply of GPUs, that supplier would

38:59

effectively be a node. And in version one, they would be rewarded on traditional commercial terms, based on market availability. So the Render Network would pay that GPU provider directly for 100% of their time

39:27

and then sell 100% of that time on to the Render Network customer.
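
Editor’s note: in numbers, that version-one arrangement is a simple pass-through. The hours and rates below are hypothetical inputs for illustration only:

```python
def v1_block_booking(hours: float, provider_rate: float, customer_rate: float) -> dict:
    """Illustrative v1 flow: rent 100% of a provider's GPU time at a
    negotiated commercial rate and sell 100% of it on to the customer
    at a locked-in rate."""
    return {
        "paid_to_provider": hours * provider_rate,
        "charged_to_customer": hours * customer_rate,
    }

# e.g. a customer block-books 720 hours (30 days) of a single GPU:
print(v1_block_booking(720, provider_rate=2.00, customer_rate=2.50))
```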

39:35

So in addition to that, there may well be an element of attracting customers to that type of compute, in which case that would be covered by grants from the grants emissions pool. And this all really fits very much into a kind of alpha-stage

40:04

discovery phase of seeking out customers. And one of the big reasons for adding this to the RNP is to determine if there is demand in the market that the Render Network could look to fulfill in a meaningful way.

Silvia (Render Network Foundation) — 40:27

Hey, Tristan. Yeah, hey, how are you? Thank you for explaining that. So many questions have come up around this, around whether the procuring of compute time makes the foundation an owner of hardware. You’ve answered it in a few different channels, but do you want to touch on how actually that’s not the case? Yeah, 100%.

Tristan (Render Network Foundation) -

41:04

Apologies for my choice of wording there. Procure in this context means procure GPU time. So the foundation would not own the hardware. And I saw another question, whether OTOY would be providing GPUs. I don’t believe that is currently the plan. It’s certainly not part of the intention of this RNP.

41:34

Of course, it’s an open market, and if they chose to pursue that, I think we would listen, but it would certainly have a different lens on it.

Silvia (Render Network Foundation) -

41:53

Sorry, I’m just seeing questions pinging in as well. So yeah, we’ll clarify in version two of the draft that procure does not refer to the acquisition of hardware. The foundation doesn’t intend to own or operate nodes in that sense, other than some testnet nodes,

42:20

which the foundation operates for purposes of trialing out the network. And some of the potential compute customers have used that testnet to explore how they could orchestrate their jobs on a distributed GPU network with their various Docker images.

42:47

Sounds good, a few questions coming in.

42:55

So I’m happy to address them.

42:59

I’ll start with Toby’s. Thanks for sending through your questions. When will RENDER payments be integrated into the network? This is not necessarily related to the compute network, but I’m happy to answer it real quick, and then we can come back to compute. Let’s just give Tristan a moment to have another drink of water. So, when will RENDER payments be integrated into the network? I think we’ve shared previously, and Luke and Mathis have done a nice job over the past few weeks and months

43:28

answering a few of these questions from the community. It is on our roadmap. We have intentions of making sure that we can enable payments via the RENDER token.

43:45

It’s a fairly busy roadmap and a fairly busy developer team. They’ve been heads-down building the compute network.

43:54

The team in particular has been quite busy working…

44:02

…to the community before the end of the year.

44:10

…forgotten about it. It is still on our roadmap. So thank you for your patience. We’ll obviously update…

44:22

Let’s see, another question that came in, also from Toby. It says it’s for Tristan; I’m happy to maybe start off, or leave it for Tristan to answer. Can we get a timeline update on the compute network? What has been the biggest challenge getting it ready? I’m happy to kick off, just because I’ve been involved working with the team here. I don’t think we can necessarily think about it as what is the single biggest challenge rolling out the compute network.

44:50

I’d put it into context a little bit and say that, you know, what is being built here is an entirely new network, essentially from scratch, where we are building most of the marketplace, if you will. So you’ve got the GPU node operators on one side and you’ve got customer demand on the other side. And it does take some time to stand up something like this that can handle

45:16

the kinds of jobs that we intend to handle, real use cases with real compute power. And as Tristan alluded to earlier, there has just been such a rapid evolution of the types of work that businesses and developers and creators are now able to put to decentralized GPU networks, work that even just a few months ago wasn’t,

45:46

you know, even a consideration as we continue to build out the network. But it has been a steady, you know, effort that does require developers building, testing…

46:16

And of course, how RNP-021 contributes to that and adds to what RNP-019 has already resulted in. So I didn’t really have anything to…

46:36

…or any other.

46:43

Yeah, nothing from me. Okay, excellent. If there are no other questions to be answered, I think we’ve kind of covered everything else. Oh, Tristan is off mute now. Hey, Tristan. Yeah, apologies for that. Can you hear me? Yeah, loud and clear.

Tristan (Render Network Foundation) — 47:07

Okay, super. Yeah, I was happily chiming away on mute, saying that Silvia had done a fantastic job of covering both those topics. You know, one of the things that just keeps coming back is that this opportunity around general compute is so fast-moving and so significant that focusing,

47:36

you know, the lion’s share of resources on getting this one right continues to make sense.

Sunny (Render Network Foundation) -

47:59

Yeah, absolutely. And I think that is pretty much everything for this space, guys. So thank you so much again, Divesh. I know he’s offline now. It is nearly 2 a.m. where he’s located in South Africa right now. So hopefully he can get some rest before the baby wakes him up. And Tristan, yourself as well.

48:26

Yeah, thank you so much for joining us. It’s been quite a hefty topic to digest, but we absolutely look forward to the next iteration of the RNP-021 draft for everyone to hopefully build on. And then we can get some enterprise GPUs on the Render Network for some of these hardcore, dedicated AI jobs.

Tristan (Render Network Foundation) -
48:53

Thanks so much, Sunny. I appreciate you sticking it out to the wee hours. Oh, no worries. All right, thanks everybody for joining, and we will absolutely see you again next week, most likely at the usual time of 7 p.m. UK. Until next time. Bye. See you later, Silvia.


Render Network

Written by Render Network

Try the leading decentralized GPU computing platform today at: https://rendernetwork.com/
