Forum Discussion

shakeninsane
6 years ago

GPU not recognized by game - trying to follow the tutorial

I'm in way over my head. I've upgraded from a GeForce 650 to a GeForce 1050 Ti. The game says it can't recognize the GPU, so I haven't actually tried it yet for fear of frying my motherboard's integrated GPU. I've only opened the game to check the settings, which reverted back to "crummy" when I did the GPU switch.

GeForce Experience recognizes the card, of course, but can't optimize any of my games (this seems to be a new issue on their side, though, because I wasn't alone in it).

I can't tell whether my game runs on my motherboard's integrated GPU or my new 1050, because I can't find a way to check (googling gives me ancient answers that don't fit anymore).

When I run the game, I get the GeForce Experience pop-up, and I can run it from the GeForce Experience panel, which would suggest I'm using the new GPU, but I can't be sure. The game loads fast and looks OK, but I haven't actually play-tested it.

I tried following the tutorial, and this is my biggest issue: the part where you alter the SGR files. I've gotten to the step where you rename the or match("${cardName}", "*#####*") line, but my problem arose a few steps before that, because I couldn't find a GeForce model that fit my 1050 series, so I just chose a random "GTX 580" or something and renamed it to my ID.

But if I'm supposed to change the card name and input, I can't "rename" the 580 because the title is too short, and I'm afraid I'm going to botch it. I don't understand why we need to rename one that "fits"?? Can anyone tell me what my ID would be if I want to write
or match("${cardName}", "*Geforce GTX 1050 ti*")? Would it be simply "*GTX 1**0*"?? Or?

And this tutorial also seems sort of old (it still talks about "if your game gets patched", as if Sims 3 has even been patched lately). Is this tutorial still valid now that you can run the game through GeForce Experience? I don't want to sit and hassle with all this if it's for nothing.
But now I can't test it either until I'm sure I've gotten the right input in the last SGR file.

I'm so confused! I'm in way over my head here.

Basically, I need help with the SGR files (especially the last one)
And I want to know how to tell if my game is already running through my GPU or not.
  • It sounds like you have the card info sorted out. The point of borrowing the profile of the older card is that you don't have to add the 1050 ti to graphicsrules, because that's the part that can throw people off, and the game doesn't care that its database name doesn't match your card. (Also, not that it matters, but an Nvidia 580 is a much stronger card than a 650, with more than twice the performance in some benchmarks.)

    The texture memory shouldn't still list <>, although I'm not completely sure it matters. Did you put both the # and a space immediately in front of setb in the graphicsrules entry? It's probably not a big deal though if you can't figure out how to fix it.

    Yes, go ahead and max out your graphics settings. I don't think TS3 sets everything to max for any card, but you can adjust things however you like. A 1050 ti should be able to handle ultra settings if you're playing at a 1920x1080 resolution. The two you may want to dial back are water and high-detail lots, as they're the most demanding and will stress the game engine even if your card is handling its workload just fine.
  • Even if your 1050 ti isn't recognized, it should still be running the game. The easy way to make sure is to open deviceconfig (in Documents\Electronic Arts\The Sims 3) and check what it lists in the graphics card info, starting 30 lines down. That's the card TS3 is using.

    There's no reason to try to force GeForce Experience to optimize your game. In fact, it would probably be counterproductive for TS3, and it would definitely override any changes you tried to make to your graphics settings. Adjust the settings yourself, and you'll be fine.

    There are two approaches to getting a card recognized, and I think your confusion stems from combining the two. The simpler way to do it doesn't even touch graphicsrules.sgr, just graphicscards. The idea is that you let your card borrow the profile of a card that's already in the database. So all you have to do is change the listed device ID (in graphicscards.sgr) of an older card to match your card, and you're done.

    Using the Nvidia 580 is fine; it's already listed as an uber card. Weirdly, graphicscards.sgr lists the device ID differently for the 580 than for the others, but it's not going to matter for what you're trying to do. Just change the device ID (the 1080) in this line:

    card 1080 "Geforce GTX 580"

    to the device ID of your own card. You can find that ID in deviceconfig, among other places; look for this:

    === Graphics device info ===
    Number: 0
    Name (driver): Radeon Pro 560
    Name (database): AMD Radeon Pro 560 Series
    Vendor: ATI
    Chipset: Vendor: 1002, Device: 67ef, Board: 0179106b, Chipset: 00c0

    Replace the old device ID with yours, and you're good to go. But write it as 0x67ef (that's a zero) just in case it matters to the game.
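    Putting that together, the edited line would end up looking something like this (the device ID shown here is just an example, the one a 1050 ti typically reports; use whatever your own deviceconfig lists):

    card 0x1c82 "Geforce GTX 580"

    Note that only the ID changes; the "Geforce GTX 580" name stays exactly as it was.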

    The result of this is that TS3 will "think" you have an Nvidia 580, but that doesn't matter. It will be using the profile of the 580 in graphicsrules, so you won't have to add a separate entry for your 1050 ti. You'll know the edit worked when you see the 580 listed as the database name in deviceconfig, as shown above.

    The last thing to do is to check the texture memory (TS3's name for VRAM) listed in deviceconfig. It's a couple lines down from the device ID and should look something like this:

    Texture memory: 1024 MB

    If it says 32 MB <>, you have one more edit to make. In graphicsrules, find this, 8-10 lines down:

    if ($textureMemory == 0)
    seti textureMemory 32
    setb textureMemorySizeOK false

    and change it to this:

    if ($textureMemory == 0)
    seti textureMemory 1024
    # setb textureMemorySizeOK false

    So you're changing the 32 to a 1024 and adding a # and a space in front of setb. This will force the game to recognize your card's dedicated video memory. By the way, a 1050 ti has 4 GB VRAM, not 1, but TS3 can only use 800 MB anyway, so setting it to 1024 MB is fine.

    P.S. The comments about patching are irrelevant here. Every time the game patched, it would revert the changes to graphicscards and graphicsrules, so any edits would have to be redone. But the only ways those files would change now is if you ran the Super Patch for some reason, or maybe if you repaired the game in Origin or Steam.
  • I seem to have been able to list my card in GraphicsCards, but not GraphicsRules. The input I made in GR was or match("${cardName}", "*GTX 1??0*"), but it didn't take, apparently.

    In my deviceconfig it now says Found: 1, Matched: 0,
    so I probably managed to "borrow" the 580 (but not with the GraphicsRules edit)?

    Anyway, thank you!! I've changed the texture memory thing and will check it out ASAP.

    I'm not sure I understand the other part. Should I NOT change the "GTX 580" line, ONLY the ID in front?
    My device ID is already 0x1C82 (with a zero), so should I just replace the 580's ID number with my GPU's ID number and remove (or just leave) the edit in GraphicsRules?

    Sorry if I'm slow, lmao, this is all brand new to me. So much for "plug and play", haha.


    edit:

    I edited the number in GraphicsCards, did the textureMemory change in GraphicsRules, and did nothing with my "wonky" edit that didn't work.

    I now have Found: 1, Matched: 1. Sims thinks it's a GTX 580 (which I guess is kinda dumb since I previously had a 650, haha).


    Here's the info

    === Graphics device info ===
    Number: 0
    Name (driver): NVIDIA GeForce GTX 1050 Ti
    Name (database): Geforce GTX 580
    Vendor: NVIDIA
    Chipset: Vendor: 10de, Device: 1c82, Board: 86261043, Chipset: 00a1
    Driver: nvd3dum.dll, Version: 26.21.14.3615, GUID: D7B71E3E-5FC2-11CF-EC7D-28A61BC2D735
    Driver version: 3615
    Monitor: \\.\DISPLAY1
    Texture memory: 1024MB <>

    Does this look correct? Should I be able to amp my settings up to high now? When I did "restore to default" just now, it didn't go to the lowest, crummiest settings, but not everything jumped super high either. But this card should be able to handle all settings on high, correct? Even if Sims thinks it's an older card?
  • Sorry for the late reply! I was bogged down with work and all that real life stuff.

    I copied the setb thing you wrote and pasted it, so in theory it should have been correct. I will take another look and see if maybe I missed something.

    As for in-game, it seems to be running super smooth on high settings. I think I have everything maxed out except maybe the two you listed; I don't remember. But upon testing, at least, it seemed to run fine. I mean, the old card ran "fine" too, but I didn't have it maxed out, and it wasn't so quick when moving the camera.