So as most people know, when you use RDP to connect to your desktop, it disables the graphics card and uses generic CUDA.

I don't want Windows to revert to using CUDA instead of the Graphics Card. I have a GTX 780ti in the computer but it isn't being used by RDP. Is there any way to force Windows to use the hardware graphics card?

I've tried TightVNC, RealVNC and LogMeIn, but I want to use RDP as it is the fastest and works best for me.


Solution 1:

Firstly, you are getting your terms mixed up. CUDA is an NVIDIA technology for general-purpose programming on their GPUs (and other things, but that's the simplest description).

Microsoft's RDP uses its own graphics driver, which converts the rendered screen into network packets to send to the client.

This is the core of how RDP works and you cannot change it.

On the server, RDP uses its own video driver to render display output by constructing the rendering information into network packets by using RDP protocol and sending them over the network to the client. On the client, RDP receives rendering data and interprets the packets into corresponding Microsoft Windows graphics device interface (GDI) API calls.

Source: http://msdn.microsoft.com/en-us/library/aa383015(v=vs.85).aspx
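
To make that render-on-the-server / replay-on-the-client split concrete, here is a tiny Python sketch of the idea. This is not the real RDP wire format; the operation names and framing are made up purely for illustration:

    import json
    import struct

    # Toy model of the quoted paragraph: the server-side display driver turns
    # drawing operations into length-prefixed packets, and the client turns
    # those packets back into local drawing (GDI-like) calls. Not real RDP.

    def encode_ops(ops):
        """Server side: serialize a list of drawing ops into one packet."""
        payload = json.dumps(ops).encode("utf-8")
        return struct.pack("!I", len(payload)) + payload

    def decode_and_replay(packet, gdi):
        """Client side: parse the packet and replay each op locally."""
        (length,) = struct.unpack("!I", packet[:4])
        ops = json.loads(packet[4:4 + length].decode("utf-8"))
        for op in ops:
            gdi[op["call"]](*op["args"])  # dispatch to a GDI-like stub

    # Hypothetical drawing ops and client-side stubs standing in for GDI calls.
    ops = [{"call": "rectangle", "args": [0, 0, 640, 480]},
           {"call": "text_out", "args": [10, 10, "hello from the server"]}]
    gdi = {"rectangle": lambda *a: print("Rectangle", a),
           "text_out": lambda *a: print("TextOut", a)}

    decode_and_replay(encode_ops(ops), gdi)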

Solution 2:

Everything in the above answer is correct except for "This is the core of how RDP works and you cannot change it". Never say never.

There are two ways to utilize a better graphics driver over RDP without slow, laggy third-party software and without modifying any Windows DLLs.

  1. (super hard) Install Windows Server 2012 R2 on a physical host. Then use Hyper-V to create a virtual desktop environment and install your OS as one of those virtual desktops. Install and configure the server roles for Remote Desktop Services. You will then be able to add a virtualized GPU to the virtual machines running on that server. When you RDP to those machines they will use RemoteFX, which is capable of 3D rendering and DirectX 11 (see the first sketch after this list).

  2. (medium hard) Install Windows Server 2008 R2 on a physical host and install the Remote Desktop Services server role. With that installed, there is a registry-backed setting that lets you pass your physical GPU's rendering on to RDP users, and another that lets you use the RemoteFX vGPU if you prefer. Yes, you can even run a server with no physical GPU. This method ONLY works on Windows Server 2008 R2 (see the second sketch after this list).
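
For option 1, once the VM exists, attaching the RemoteFX 3D adapter is normally a one-liner in Hyper-V Manager or PowerShell. A minimal sketch, assuming the Hyper-V role and its PowerShell module are installed, you are in an elevated prompt, and your VM is named "GamingVM" (a placeholder name):

    import subprocess

    # Wraps the Hyper-V PowerShell cmdlet that adds a RemoteFX 3D video
    # adapter to an existing VM. "GamingVM" is a hypothetical VM name.
    VM_NAME = "GamingVM"

    subprocess.run(
        ["powershell", "-NoProfile", "-Command",
         f"Add-VMRemoteFx3dVideoAdapter -VMName '{VM_NAME}'"],
        check=True,
    )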
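
For option 2, the answer doesn't name the exact value. The one most often cited for forcing the hardware GPU is bEnumerateHWBeforeSW under the Terminal Services policy key, but treat both the key path and the value name as assumptions and verify them against your own Group Policy editor before relying on them. A rough sketch of flipping it with Python's winreg (run elevated, then reconnect over RDP):

    import winreg

    # Assumed location of the policy value; verify before relying on it.
    KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        # 1 = try the hardware GPU before the software RDP display driver
        winreg.SetValueEx(key, "bEnumerateHWBeforeSW", 0, winreg.REG_DWORD, 1)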

RDP stands for Remote Desktop PROTOCOL. It is simply a step-by-step procedure for breaking the image, sound, and control data down into network packets to send. RDP itself has nothing to do with rendering or hardware acceleration. If you look at Event Viewer right after you "RDP" into a machine, you can find where Windows originally loads the graphics driver for your local hardware and then, immediately afterwards, disables it and loads the default, terrible driver.
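
If you want to see that driver swap for yourself, a quick way is to dump the most recent System log entries right after connecting and filter for display-related lines. A rough sketch (assumption: the relevant entries land in the System log on your machine; the exact event IDs and wording vary by driver):

    import subprocess

    # Dump the 50 newest System log events in text form and keep only lines
    # that mention the display or graphics driver. Run this right after an
    # RDP connection to catch the driver being swapped out.
    out = subprocess.run(
        ["wevtutil", "qe", "System", "/c:50", "/rd:true", "/f:text"],
        capture_output=True, text=True, check=True,
    ).stdout

    for line in out.splitlines():
        if "display" in line.lower() or "graphics" in line.lower():
            print(line.strip())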