@VirtualVirgin said:
I'm guessing it's the NVIDIA RTX 4090, but would like to hear about it.
Are there benchmarks available? Some kind of test that shows how much processing is offloaded with different models? Are there latency benchmarks as well? Do higher-end graphics cards yield lower RTL? If so, by how much?
Hi @VirtualVirgin, the answer is quite simple: as much as you can afford 😊
Regarding accurate testing: I think any modern desktop-grade GPU can handle any workload you will practically need. If we're talking about desktop-grade GPUs to recommend, you can buy an RTX 4060 Ti with 16 GB of memory for $450-$500 on Amazon U.S...
I can imagine 16 GB working for 99.9% of cases: a few hundred instruments, multichannel... of MIR Pro 3D.
Once new products are supported (hopefully soon), that amount of memory should still be enough to power the largest session you have ever created.
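To make the "16 GB covers a few hundred instruments" claim concrete, here is a rough back-of-envelope sketch. The per-instrument cost and base overhead below are hypothetical placeholder figures I made up for illustration; the real numbers depend on the plugin, the sample content, and the driver, so treat this as a way of reasoning, not as measured data.

```python
def vram_headroom_gb(total_vram_gb, instruments,
                     gb_per_instrument=0.03,   # ASSUMED ~30 MB per instrument instance
                     base_overhead_gb=2.0):    # ASSUMED OS/driver/engine reserve
    """Estimate the VRAM left over after loading a session of `instruments`."""
    used = base_overhead_gb + instruments * gb_per_instrument
    return total_vram_gb - used

# With these assumed figures, a 16 GB card keeps headroom even at 300 instruments,
# while a 12 GB laptop part is noticeably tighter:
print(vram_headroom_gb(16, 300))  # 16 - (2 + 9) = 5.0 GB to spare
print(vram_headroom_gb(12, 300))  # 12 - (2 + 9) = 1.0 GB to spare
```

If you plug in the actual memory footprint you observe for your own sessions (e.g. from a GPU monitoring tool), the same arithmetic tells you how much headroom a given card leaves you.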
For laptop-grade GPUs I would not recommend going below the RTX 4080, which most often comes with 12 GB. A laptop with one costs around $2,300-$2,500 on Amazon U.S.
The same goes for AMD equivalents.
Macs are different, but I would never recommend going below 16 GB. I use an M2 Air with 24 GB, and it's enough for everything; it's powerful enough to do anything I want (which is probably why I don't want to go with a MacBook Pro, even though I have the budget).
I hope that helps!