Gaming on my NVIDIA Tesla GPUs - Part 1 - NVIDIA Maxwell

Published 2024-05-25
Grab yourself an Insulated Coffee Tumbler at craftcomputing.store/

Support me on Patreon and get access to my exclusive Discord server. Chat with me and the other hosts of Talking Heads all week long.
www.patreon.com/CraftComputing

Nearly four years ago, I started looking at VDI (Virtual Desktop Infrastructure) through Nvidia's GRID tools. It's a system that allows you to bifurcate a GPU, essentially running multiple 3D-accelerated desktops from a single graphics card. While most official use cases revolve around desktop and office environments, I've always wanted to use it to play games. And thus, the Cloud Gaming Server Project was born!

As such, I've collected a HUGE number of Nvidia and AMD enterprise GPUs, and over the next few videos, I'm going to benchmark them all head to head: which GPUs are worth your money, and which ones should you leave on the e-waste pile?

-- vGPU Installation --
Proxmox vGPU Installation Script: github.com/wvthoog/proxmox-vgpu-installer
PolloLoco's vGPU Installation Guide: gitlab.com/polloloco/vgpu-proxmox
Manual vGPU Install Tutorial: Proxmox GPU Virtualization Tutorial w...
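
If you'd rather see the host-side steps at a glance before picking one of the guides above: on a Proxmox host the broad strokes are enabling IOMMU, loading the VFIO modules, installing Nvidia's vGPU host driver, and then checking that the card exposes mediated device (mdev) profiles. This is a rough sketch only, assuming an Intel CPU; exact module names and the driver .run file depend on your kernel and vGPU driver release, so follow the guides above for the details.

  # /etc/default/grub -- enable IOMMU (AMD hosts typically use amd_iommu=on instead)
  GRUB_CMDLINE_LINUX_DEFAULT="quiet intel_iommu=on iommu=pt"

  # Load the VFIO modules at boot
  echo -e "vfio\nvfio_iommu_type1\nvfio_pci\nvfio_virqfd" >> /etc/modules

  # Apply, reboot, then install the Nvidia vGPU host driver (.run file)
  update-grub && update-initramfs -u -k all && reboot

  # After the host driver is installed, these should list the available vGPU profiles
  mdevctl types
  nvidia-smi vgpu

Each gaming VM then gets one of the listed mdev types attached as a PCI device in Proxmox and runs the matching GRID guest driver inside Windows or Linux.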

-- vGPU Memory and Frame Limiter Unlocks --
(Under 'Define GPU Profiles')
drive.google.com/file/d/1Ok1dZkxI_yk_WLYSICi4bmMix…
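
For anyone new to the 'Define GPU Profiles' step: with the vgpu_unlock-rs setup used in the guides above, per-profile overrides live in a TOML file on the Proxmox host, and that file is where the framebuffer split and the default 60 FPS frame rate limiter get changed. A minimal sketch, assuming the vgpu_unlock-rs override format; the profile ID and framebuffer values below are illustrative, so take real numbers from the spreadsheet above for your card and VM count.

  # /etc/vgpu_unlock/profile_override.toml
  [profile.nvidia-18]            # example profile ID -- use one your GPU actually exposes
  num_displays = 1
  display_width = 1920
  display_height = 1080
  max_pixels = 2073600           # 1920 x 1080
  cuda_enabled = 1
  frl_enabled = 0                # 0 removes the default 60 FPS frame rate limiter
  framebuffer = 0x74000000       # illustrative VRAM slice; size it per the spreadsheet
  framebuffer_reservation = 0xC000000

Worth noting from Nvidia's vGPU docs: the frame rate limiter is also disabled automatically if you switch away from the best-effort scheduler, but the alternative equal-share and fixed-share schedulers only exist on Pascal and newer, so on Maxwell cards like the M40 and M60 the override above is the practical route.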

Links to items below may be affiliate links for which I may be compensated

-- GPU Server --

Asus ESC4000 G3 GPU Server: ebay.us/ngCIDO or ebay.us/hAThwr
Asus ESC4000 G3 Sliding Rails: www.neobits.com/asus_tool_less_rail_kit_tool_less_…
Intel Xeon E5-2697 v4 18-Core: ebay.us/6nlzsD
8x32GB DDR4-2400 REG-ECC Memory: ebay.us/4DNJ00
1.92TB Patriot Burst Elite SATA SSD: amzn.to/4b8Q9jc

-- GPUs --

Nvidia Tesla M40: ebay.us/824gH4
Nvidia Tesla M60: ebay.us/NNS4Ql

All Comments (21)
  • @WillFuI
    I’m excited for the Pascal version of this video, as I feel that's the best spot right now
  • @KomradeMikhail
    9:47 "I think it's finally time to say goodbye to Kepler, altogether." Considering it's been almost 3 years since Kepler support was discontinued, likely Maxwell will be dropped any day now too. I half expected Jeff to stuff this thing with Pascal or higher. I just sold off a couple Maxwell based Quadro's before I expect their value to take a dip.
  • @DustinShort
    Really looking forward to the Pascal results. I have a P40 I haven't been able to test yet, but I'm hoping I can make a low-cost VDI server for small, low-demand CAD clients.
  • @edgecrush3r
    Happy to see the Tesla vGPU series is back! It has been my favorite series and got me into so many discoveries in the past. It would be nice to see more detailed installation instructions, as setup can be challenging.
  • @blakecasimir
    Aside from gaming, Tesla P40s make for a cheap, though slow, still-usable option for hosting LLMs. The 24GB of VRAM helps.
  • @dectoasd3644
    Look forward to the P40 test as I have a couple of these in a X99 desktop board with E5-2597 V4. Mostly been playing with AI inference but spinning up some local cloud for older games seems like it could be interesting.
  • @ytdlgandalf
    Good experiment setup. Love these proper benchmarks
  • @AlexKidd4Fun
    I've been looking forward to this series! Thanks for the great content, Jeff!
  • @xerox9426
    By default, the Frame Rate Limiter (FRL) is enabled for all GPUs (60FPS). The FRL is disabled when the vGPU scheduling behavior is changed from the default best-effort scheduler on GPUs that support alternative vGPU schedulers. The Tesla M10 is actually stronger than the GRID M40 by 20%-25%.
  • @5ub5pace
    16:40 Helldivers 2 is quite CPU heavy. It's not an apples-to-apples comparison: you're probably seeing the more modern 7840U cores pull ahead of the Broadwell cores, despite the Tesla GPUs being faster than 780M in GPU-bound scenarios.
  • @computersales
    I love my M40 even though I don't use it much anymore. I don't understand why the Pascal stuff hasn't really dropped much in price though. It seems like it's holding on for dear life.
  • @Feynt
    Good run through. Looking forward to the Pascal tests.
  • @joseph3164
    Thanks for the video! Looking forward to the Pascal results. The Tesla P4 is a powerhouse of a low-profile GPU.
  • @NdMoreSpd1.0
    For starters, love the testing; it's good to have an idea of where different cards land should anyone dive into this abyss. I'm interested in seeing a breakdown of how this compares to building, buying, and supporting multiple desktop machines versus a single rack-mounted server with VMs. As you mentioned towards the end, power consumption against performance also plays a big part for some folks, so trying to at least mention efficiency would be good to see as well. I'm most interested in how the numbers change when dropping from 8 VMs down to maybe just 4, or even 2, on the simpler cards. What numbers might you see if you ran, say, two of the M40s with just two VMs running? But again, thanks for helping me consider something that's been on my mind for years, if not within my patience: could I get the kids gaming with me without having to break the bank...
  • @almc8445
    Awesome video! I love the idea of a virtual gaming PC! On another note, could you do a super quick video breaking down the naming conventions for Nvidia/AMD server cards? Specs aren’t usually too hard to find, but broad descriptions of the series and product lines are extremely rare.
  • @annihilatorg
    I would appreciate it if you added some highlights to your data grids when you call out specific results.
  • @cxmxron_7964
    Noticing that most of these old compute cards are cheap because newer AI services don't support their older CUDA compute capability. Triton Inference Server, for example, supports compute capability 6.0 and newer (going off memory here), and the Tesla M40 24GB is compute capability 5.2. I learned this the hard way too, since the NVIDIA docs weren't clear, so I wasted money on a card.
  • @obraik
    Have you looked at Hyper-V and its built-in GPU partitioning that also works with consumer cards? I use it in my home server for various VMs running Plex, Frigate and one or two other video decoding/encoding activities. No special mods or drivers required, and it works with both Windows and Linux VMs.
  • @tobiahhowell
    I'm actually doing something similar but a bit less technical. I plan on upgrading my gaming PC so it can handle any game I want to play, but I have a pretty small office space and I don't want to play games in a sauna... so I'm going to put my gaming PC in my garage and use Steam Remote Play to stream games to a smaller PC actually in my office. Your method seems to be quite a bit more compact than the 4U server chassis I found for my computer 😅. Also, during my testing to make sure my setup would work, I found that Steam Link (an older version of Steam Remote Play that you can install on Raspberry Pis) doesn't like AMD graphics cards. It works just fine with NVIDIA cards (I don't have any Intel cards to test), but AMD cards do not work well 😅 I think it's an encoding issue. Awesome video, by the way! Keep up the awesome work!
  • @Wesrl
    With these being so cheap, I'm wondering if just using them for transcoding would be useful.