08-01-2012 07:38 AM
I have successfully managed to fit an Asus GTX560Ti CUII 1gb DDR5 card into the tiny innards of my E30.
I have also upgraded the PSU to a 650W ACE 'Black-edition' A-650BR. Things are running nicely, for now anyway.
I use the GTX560Ti GPU for iRay CUDA rendering in 3DS Max 2013.
Can I run the monitor off the Intel HD P3000 integrated chip, while still having the GTX560Ti recognized by the E30 as a secondary card?
I tried switching the BIOS 'Video' setting from 'AUTO' to 'IGD' (or whatever the abbreviation for the internal graphics was), then plugged the monitor into the VGA connector on the motherboard's rear panel. It booted up fine, but when I ran 3DS Max 2013 it couldn't find the GTX560Ti card.
I can leave the BIOS 'Video' setting on 'AUTO' and the GTX560 runs fine as the primary card. I just want to run it as a secondary card, so as to free up the card's memory for iRay rendering instead of it serving as the Windows display adapter.
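One way to sanity-check whether the driver has actually initialized the card, independent of what 3DS Max or the Asus apps report, is to ask `nvidia-smi`, which ships with the NVIDIA driver and lists every GPU the driver can see even if it isn't driving the display. A minimal Python sketch, assuming `nvidia-smi` is on the PATH; the `list_nvidia_gpus` helper and the sample string are illustrative, not part of any official tool:

```python
import re
import subprocess

def list_nvidia_gpus(smi_output):
    """Parse `nvidia-smi -L` output lines like
    'GPU 0: GeForce GTX 560 Ti (UUID: ...)' into a list of GPU names."""
    return re.findall(r"GPU \d+: (.+?) \(", smi_output)

def cuda_gpus_present():
    """Ask the NVIDIA driver which GPUs it can see, regardless of which
    adapter is driving the Windows display. Returns [] if nvidia-smi is
    missing or fails (e.g. the driver never initialized the card)."""
    try:
        out = subprocess.run(["nvidia-smi", "-L"], capture_output=True,
                             text=True, check=True).stdout
    except (FileNotFoundError, subprocess.CalledProcessError):
        return []
    return list_nvidia_gpus(out)

# Illustrative sample of the output format (UUID made up):
sample = "GPU 0: GeForce GTX 560 Ti (UUID: GPU-xxxxxxxx)\n"
print(list_nvidia_gpus(sample))  # ['GeForce GTX 560 Ti']
```

If the card shows up here but not in Max, the problem is in the application's device enumeration rather than the BIOS setting.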
Any help or ideas?
08-02-2012 04:43 AM - edited 08-02-2012 04:50 AM
from what i recall that's acting as intended. you'll want the BIOS GPU setting on 'auto' and will likely have to use the GTX560Ti as your primary device.
even though 3dsmax isn't reporting the GTX, are your rendering times dropping with the card installed and the IGP as your primary?
edit: if that doesn't work then you might consider a 2GB version of this card instead. that would provide more room to drive a display and render. plenty of 2GB GTX560Ti models exist and should be in the same price range.
08-02-2012 06:21 AM - edited 08-02-2012 06:24 AM
Well, with the IGP as primary, upon booting to the desktop the two resident Asus graphics startup apps (SmartDoctor and GPUTweak) both reported the card missing. I then started 3DS Max and opened the iRay dialog; the spot where the GPU is usually selectable was blank. I presumed, and everything pointed to, the card not being initialized in the PC.
Anyway, I seem to have solved the 'out of memory' problems by tweaking some settings in Max:
Render Settings → Bitmap Performance and Memory Options → Bitmap Proxies on, and Render Systems → Render with Proxies.
(This setting clips and downscales textures not visible within the boundaries of the object they are applied to, so fewer and smaller bitmaps get loaded into the card's memory.)
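For a rough sense of why proxying helps, here's a back-of-the-envelope estimate of uncompressed texture memory. The 4 bytes per pixel and the one-third mipmap overhead are standard rules of thumb, and the resolutions are made-up examples, not figures from the scene above:

```python
def texture_mem_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate GPU memory for one uncompressed texture. A full
    mipmap chain adds roughly one third on top of the base level."""
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

full = texture_mem_bytes(4096, 4096)   # ~85 MB including mipmaps
proxy = texture_mem_bytes(1024, 1024)  # ~5.3 MB at quarter resolution
print(f"full: {full / 2**20:.1f} MB, proxy: {proxy / 2**20:.1f} MB")
```

Dropping a handful of 4K maps to quarter-resolution proxies recovers tens of megabytes each, which adds up fast on a 1GB card.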
I also used a third-party app called PolygonCrusher to reduce the number of polygons, and thus the faces loaded into memory. That dropped the polygon count by around 50% without any major artifacts.
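Geometry cost scales roughly linearly with vertex and face counts, which is why halving the polygons roughly halves the mesh's footprint. A sketch of the arithmetic; the 32-bytes-per-vertex layout and the mesh sizes are illustrative assumptions, not measurements from the scene:

```python
def mesh_mem_bytes(n_verts, n_tris):
    """Rough VRAM estimate for one mesh: position + normal + one UV set
    per vertex (32 bytes), plus a 32-bit index per triangle corner."""
    vertex_bytes = n_verts * (12 + 12 + 8)  # float3 pos + float3 normal + float2 uv
    index_bytes = n_tris * 3 * 4            # three 32-bit indices per triangle
    return vertex_bytes + index_bytes

before = mesh_mem_bytes(2_000_000, 4_000_000)  # illustrative scene size
after = mesh_mem_bytes(1_000_000, 2_000_000)   # same scene at ~50% polygons
print(f"{before / 2**20:.0f} MB -> {after / 2**20:.0f} MB")  # 107 MB -> 53 MB
```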
Everything works fine now, and on second thoughts I also like the GUI boost I get in apps from using the GTX560 as primary. Maybe when the department's budget is reset next year I'll put in an order for a 2GB card, but for now I think I've sorted it to a performance level that's workable.
Thanks for your help, Erik.
03-20-2013 08:55 AM - edited 03-20-2013 09:03 AM
Any new tips on how to get this working? My 3D models are getting more detailed and consequently no longer fit into the 1GB of GPU memory (which idles at around 600MB used simply by loading a 3DS Max scene).
Someone must have figured this out... anyone?
I am getting a 2GB card in a few weeks, but would like to figure this out regardless.
p.s. that first PSU our IT team bought for me blew up. After Googling it, I found out it was made in Sweden, where they have since banned it for use in their own country?!?! I have a nice Corsair Enthusiast 650W now. Working fine.