08-12-2011 05:41 PM
I need help. I'll keep it simple:
I have an NVIDIA Quadro 1000M card in my Lenovo W520. However, dxdiag and the ThinkVantage Toolbox both summarize my system as running the Intel HD Graphics Family adapter. The Quadro is enabled in the BIOS, and up-to-date drivers are installed. Device Manager shows both the Intel HD Graphics Family adapter and the Quadro as installed and "working properly." If I disable the Intel adapter, the screen goes black and I have to force-restart into Safe Mode to re-enable it.
My question is: which card is actually working? And how can I get the Quadro to run instead of the Intel (so that dxdiag and the ThinkVantage Toolbox accurately report the Quadro as my video card)?
P.S. I've called tech support four times; they're no help, and so I turn to the interwebs for assistance. Your help would be tremendous.
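For what it's worth, you can also list both adapters from the command line with `wmic`, which ships with Windows 7. A minimal sketch; the sample output below is hypothetical, saved into a variable so the parsing step can be shown:

```shell
# On the W520, list the video controllers Windows knows about:
#   wmic path win32_VideoController get Name,Status
# Hypothetical sample of that output, for illustration:
sample="Name                             Status
Intel(R) HD Graphics Family      OK
NVIDIA Quadro 1000M              OK"
# Count adapters reporting OK -- on an Optimus machine both should appear:
ok_count=$(echo "$sample" | grep -c "OK")
echo "Adapters reporting OK: $ok_count"
```

Seeing both adapters listed as OK matches what Device Manager shows; it doesn't tell you which one is rendering at any given moment.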
08-12-2011 10:46 PM
You have an "Optimus" configuration; this is what switchable graphics has morphed into. The machine only runs the 1000M when necessary, so that it can run cooler, run quieter, and use less battery power. The machine decides when to turn the card on, and when the NVIDIA card is off it uses no power. The latest NVIDIA drivers should include a taskbar utility that tells you when the 1000M is active. You can override this in the BIOS if you need to for some reason.
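If the driver package installed the `nvidia-smi` tool (on Windows it typically sits under the NVIDIA Corporation program folder rather than on PATH, and the query flags below assume a reasonably recent driver), you can also spot-check whether the 1000M is active from a console. A sketch; the sample line is hypothetical:

```shell
# Query the dGPU name and its current performance state:
#   nvidia-smi --query-gpu=name,pstate --format=csv,noheader
# Hypothetical sample output; P8 is a deep idle state, P0 is full 3D:
sample="Quadro 1000M, P8"
pstate=$(echo "$sample" | awk -F', ' '{print $2}')
case "$pstate" in
  P0|P1|P2) echo "dGPU busy (3D clocks)" ;;
  *)        echo "dGPU idle ($pstate)" ;;
esac
```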
08-13-2011 07:32 AM
Right-click on Desktop --> NVIDIA Control Panel --> Desktop (Menu) --> Display GPU Activity Icon in Notification Area
(assuming you're running Windows)
If the icon in the notification area is colorful, the NVIDIA GPU is running. If you click on it, it shows you which application is using the dedicated GPU.
08-13-2011 11:15 AM - edited 08-13-2011 11:20 AM
Actually, there is more to this story: don't forget which display is being used.
How things run also depends on the attached displays, if any. In Optimus mode, with an external display attached via VGA you will notice the Intel integrated GPU is used; the same goes for the laptop LCD.
If, however, you have a display attached via DVI or DisplayPort, you'll notice the NVIDIA 1000M or 2000M is used.
Also, the notion that the Intel integrated GPU runs cooler than the NVIDIA GPU isn't entirely accurate. My W520 is set to discrete graphics all the time, and I monitor fan speed and temperatures with HWiNFO64. At idle my CPU is usually 47°C or lower, the NVIDIA 2000M is usually at 42°C, and the fan averages 1980 RPM.
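As a command-line alternative to HWiNFO64 for spot-checking the dGPU temperature, `nvidia-smi` can report it directly (assuming the tool is available for your driver; the sample value below is made up to illustrate the check):

```shell
# Query the GPU core temperature in plain Celsius:
#   nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader,nounits
# Hypothetical sample reading, matching the idle figure mentioned above:
temp="42"
if [ "$temp" -ge 80 ]; then
  echo "GPU hot: ${temp}C"
else
  echo "GPU fine: ${temp}C"
fi
```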
08-13-2011 02:34 PM
I partly agree with you. But in my case, connecting an external display via VGA in Optimus mode also used the NVIDIA GPU (1000M).
When using two external monitors over VGA and DisplayPort-to-HDMI, the NVIDIA Control Panel indicates that both ports are driven by the dGPU. In this case the NVIDIA GPU is also in 3D mode, which raises the clock to 1400MHz and makes it run hotter.
Does anyone know a good tool (lightweight, quick to use, unobtrusive) to manually force the card into 2D mode?
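I don't know of a tool that forces 2D mode, but you can at least confirm whether the card has dropped back to 2D clocks by querying the current graphics clock with `nvidia-smi` (assuming a recent enough driver; the 1400 figure in the hypothetical sample mirrors the 3D clock mentioned above):

```shell
# Query the current graphics clock in MHz:
#   nvidia-smi --query-gpu=clocks.current.graphics --format=csv,noheader,nounits
# Hypothetical sample reading while two external monitors are attached:
clock="1400"
if [ "$clock" -ge 1000 ]; then
  echo "3D clocks active (${clock} MHz)"
else
  echo "2D/idle clocks (${clock} MHz)"
fi
```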
08-13-2011 07:01 PM
Oh no, heat is not a problem, I guess.
It just bothers me that the dGPU clock rises to 3D mode even though I'm only doing light office/browsing work. I'd expect more from a technology like Optimus, since its whole point is to use only as much power as needed.
My CPU is at around 50°C, with two external 1080p monitors (VGA and DisplayPort), BIOS 1.26, and default BIOS fan control (no TPFC).