Intel HD Graphics 4000 is used instead of the Nvidia GeForce 630M for an old game?
Solution 1:
I authored a question on this subject a few years ago, so I might as well chime in with what I know.
Your laptop uses a technology called Nvidia Optimus to render video output from two GPUs: the integrated Intel graphics processor (IGP) and the more powerful dedicated Nvidia graphics card (DGPU). This is accomplished by connecting the laptop's screen to the framebuffer of the IGP only and allowing the DGPU to write pages of memory directly into that framebuffer. In this way, both cards can render output to the same screen, even simultaneously. When an application calls for DGPU rendering, the DGPU writes its output to the portion of the screen that the application occupies. In the case of a full-screen application such as a game, the DGPU will write to the entire framebuffer of the IGP. A much more detailed description of this process is available in the Nvidia Optimus whitepaper.
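One way to see this split for yourself is that Windows exposes both chips as separate display adapters, even though only the IGP drives the built-in panel. Below is a minimal sketch, assuming a Windows machine with the DirectX (DXGI) headers available and `dxgi.lib` linked, that simply enumerates the adapters the OS reports - on most Optimus laptops you should see both the Intel and Nvidia entries, though some older driver setups hide the DGPU from this list:

```cpp
// Minimal sketch (assumption: Windows build with DXGI headers, linked against dxgi.lib).
// It only lists the adapters the OS exposes; which adapter a given game actually
// renders on is decided by the Optimus driver profile, not by this enumeration order.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1),
                                  reinterpret_cast<void**>(&factory))))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        wprintf(L"Adapter %u: %s (%llu MB dedicated VRAM)\n", i, desc.Description,
                static_cast<unsigned long long>(desc.DedicatedVideoMemory) / (1024 * 1024));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```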
When a graphics-heavy application such as a game performs poorly on an Optimus-enabled machine, the logical first step is to ensure that the application is using the DGPU rather than the IGP. You can do this via the context-menu entry you showed or, somewhat more reliably, through the Nvidia Control Panel: select "Manage 3D settings" in the pane on the left, select your application, then set its "Preferred graphics processor" to the Nvidia chipset.
You can confirm that the application is actually running on the Nvidia GPU by using the Optimus Test Viewer. This tool indicates whether the DGPU is currently active and can list which processes are making use of it.
A final workaround for Optimus-related issues lies in the video card's hardware outputs. The Nvidia Control Panel, as in your screenshot, can display which physical outputs are connected to which monitors. From your screenshot, it appears that the Nvidia GPU has one physical output; you can try plugging an external monitor into this output and confirming that it shows up as connected in the Nvidia Control Panel. If so, your monitor is now hooked directly to the framebuffer of the DGPU, meaning that Optimus is not in use and all rendering on that monitor will take place on the DGPU.
Based on the discussion in the comments on your question, you have done the following:
- Forced use of the DGPU for your game through the Nvidia control panel
- Verified through use of the Optimus Test Viewer that the game is using the DGPU
- Connected a monitor to the DGPU's hardware output and run the game on that monitor
And despite all of this, the game still runs very poorly. I can only conclude from this that the problem is not Optimus-related but something else - possibly a compatibility issue arising from such an old game, or from some aspect of your new laptop's configuration. You have mentioned that this game is open-source - if there is an active development community, they may be your next best bet for finding a resolution to this problem.
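Since the game is open-source, one more avenue may be worth raising with its developers: Nvidia's Optimus rendering-policy documentation describes a global symbol that an executable can export to ask the driver to prefer the high-performance GPU for that process, independent of any control-panel profile. Below is a minimal sketch of what a maintainer could add to one of the game's source files and rebuild; the symbol name and value come from Nvidia's documentation, but whether the driver shipped for a 630M-era laptop honours it is an assumption on my part:

```cpp
// Hedged sketch: exporting this symbol from the game's .exe signals the Nvidia
// driver to select the discrete GPU for the process (per Nvidia's Optimus
// rendering-policies document). Place it in any one translation unit of the game.
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}
```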
Solution 2:
The game Dark Reign 2 dates from June 30, 2000.
As such, it was written long before modern GPU feature sets (and CPU extensions such as the latest Streaming SIMD Extensions) existed.
This might explain why it cannot take proper advantage of a modern GPU such as yours.