How to make Sims 3 recognise graphics card?
I know this is a common topic and there are already tons of guides about it, but I've tried a few times and can't get it to work. I was advised to come here and told that puzzleaddict or someone else would probably be able to help me, so here I am.
I have an Intel(R) UHD Graphics 630 integrated card and an Nvidia GeForce GTX 1060 6GB. I tried using the RTX_Series.zip file here https://www.carls-sims-4-guide.com/forum/index.php?topic=26753.msg446248#msg446248 and followed the instructions there for overwriting my GraphicsCards.sgr and GraphicsRules.sgr files, since I was told it already had my Nvidia card added, but my deviceconfig file still says Found: 0, Matched: 0.
I think part of the problem is that I do have the two cards, and depending on which of the two slots on the tower the monitor cable is plugged into, I get different menus on the desktop and in the Nvidia Control Panel.
If I right-click on the desktop with the cable plugged into the upper slot in the tower, I get (file on left), and if I then open the Nvidia Control Panel I get (file on right, sorry, don't know why it's so fuzzy).
If I right-click on the desktop with the cable plugged into the lower slot in the tower, I get (file on right), and if I then open the Nvidia Control Panel I get (file on left).
And this is the deviceconfig as it stands.
This is why I think it's part of the problem. I don't know if I'm right; I'm really out of my depth with this technical stuff.
Can you guys help?
@thekaratekitten First of all, you should always have your monitor plugged directly into the graphics card unless you're trying to troubleshoot something that requires using the motherboard slot. This is the graphics card (I'm borrowing the diagram from someone else):
For the graphics card recognition, the issue is probably that the file you've downloaded doesn't contain your card's device ID. But you can add your card to the .sgr files manually instead.
Note: These instructions only apply to and will only work for the original poster's graphics card. If you'd like help getting your own card recognized, please post the same information from deviceconfig and ask for help.
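If you (or anyone else reading) want to look up your own card's device ID, one way is through Windows Device Manager: expand Display adapters, right-click the card, choose Properties > Details, and pick Hardware Ids from the dropdown. The top value should look roughly like this (this example is for the GTX 1060 6GB; VEN_ is the vendor ID, DEV_ is the device ID, and the rest doesn't matter here):
PCI\VEN_10DE&DEV_1C03&SUBSYS_...
Those two hex numbers are what the .sgr files need to contain.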
Open graphicscards.sgr (Notepad works fine), and Ctrl-F to search for 10de. That will take you to these lines:
vendor "NVIDIA" 0x10b4 0x12d2 0x10de
card 0x0fd1 "GeForce GT 650M"
card 0x0fd2 "GeForce GT 640M"Create a new line under the "Nvidia" line, copy this text, and paste it in the new line:
card 0x1c03 "GeForce GTX 1060"
So you should now see this:
vendor "NVIDIA" 0x10b4 0x12d2 0x10de
card 0x1c03 "GeForce GTX 1060"
card 0x0fd1 "GeForce GT 650M"
card 0x0fd2 "GeForce GT 640M"(with indents from spaces that this site isn't displaying properly). Save, quit, and open graphicsrules.sgr. Crtl-F and search for 8800, which will take you here:
elseif (match("${cardName}", "*8800*") or match("${cardName}", "*9500*") or match("${cardName}", "*9600 GSO*") or match("${cardName}", [etc.]
Change the 8800 to GTX 1060. Don't change anything else, not even the asterisks. This will classify your card as uber.
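After the edit, that line should start like this (everything past the part shown here stays exactly as it was):
elseif (match("${cardName}", "*GTX 1060*") or match("${cardName}", "*9500*") or match("${cardName}", "*9600 GSO*") or match("${cardName}", [etc.]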
Finally, scroll back to the top of graphicsrules, and look for this, 8-10 lines down:
if ($textureMemory == 0)
seti textureMemory 32
setb textureMemorySizeOK false
Change the 32 to 1024, and add a # and a space in front of setb to comment that line out. Your card has six times that much video memory, but TS3 can only use about 800 MB anyway. The lines should look like this:
if ($textureMemory == 0)
seti textureMemory 1024
# setb textureMemorySizeOK false
You'll know it worked if you see [Found: 1, Matched: 1] next to the card name in deviceconfig, and texture memory listed as 1024 instead of the current 32 MB override. If either edit doesn't work, please paste that same section of deviceconfig here; you can just copy and paste the text instead of taking a screenshot.
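For reference, the card's entry in deviceconfig should then read something like this (I'm going from memory on the exact layout, so yours may be worded a little differently):
GeForce GTX 1060 [Found: 1, Matched: 1]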