I'm just speechless at this point. I always thought I knew a bit about PCs, and people usually came to me first when they wanted theirs fixed.
When I bought my gaming laptop 4 years ago, it wasn't absolutely top-notch, but a rather good setup. Custom-built ~$2000 XMG laptop, i7-3630QM, GeForce GTX 660M, etc.
Mediocre today, but quite good a few years ago.
Now, I always wondered why I could only play many games on low settings, if at all. The Witcher 3 was basically a slideshow, GTA 5 ran only on the lowest settings, and today I bought Skyrim SE.
I used to play the original, but since it was a bit laggy I just played the main story and quit.
Now I start the game, set everything to super low... and get 10 FPS.
Dafuck, I think, and look up the requirements:
>CPU: Intel Core i5-750 / AMD Phenom II X4-945
>RAM: 8 GB
>OS: Windows 7/8.1/10 (64-bit)
>Video Card: NVIDIA GeForce GTX 470 1GB / AMD Radeon HD 7870 2GB
C'mon, that's way worse than what my old gaming buddy has.
I'd always had a nagging thought that my laptop might be using its integrated graphics (Intel HD Graphics 4000), but why would it? I check the Nvidia control center and it says 0 supported games.
After updating, it found a few games, but everything was still laggy as hell.
Now at this point I discovered the Nvidia Control Panel, or rather clicked around in it, and then I saw that, for whatever reason,
almost all my games, with the exception of League of Legends and Stardew Valley, were using my integrated graphics instead of my GeForce.
For four years I played games with my integrated graphics card.
I just can't get over my own stupidity at this point. Btw, Skyrim SE now runs on a mix of medium/high settings @ 60 FPS...
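
For anyone hitting the same problem: the fix is setting "Preferred graphics processor" to "High-performance NVIDIA processor" under Manage 3D Settings in the Nvidia Control Panel (globally or per game). On newer Windows 10 builds there's also a per-app GPU preference stored in the registry; here's a sketch of a `.reg` file that forces the dedicated GPU for a single game. The `.exe` path below is just an example, so adjust it to your own install:

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences]
; GpuPreference=2 = high performance (dedicated GPU); 1 = power saving (integrated)
; Path is an example only - point it at the actual game executable
"C:\\Games\\SkyrimSE\\SkyrimSE.exe"="GpuPreference=2;"
```

Same thing the "Graphics settings" page in the Windows 10 Settings app does, just done by hand.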