You thought NVidia was bad? Don't try AMD


Remember when Linus Torvalds lambasted NVidia for not supporting their Optimus technology in their Linux drivers for half a decade and counting? Well, I went out and bought an AMD/ATi video card as my upgrade. And you know what? Its Linux drivers are far, far worse than NVidia's.
1. Most of the games I had working fine on NVidia do not work on AMD. And those that do suffer far more visual corruption, synchronization bugs (like the bottom 40% of the screen rendering half a second after the top 60%), strange visual artifacts (weird triangles popping out of everywhere) and crashes, lots of crashes.
2. There were crashes with NVidia too, but NVidia never managed to crash Compiz along with it, or crash the whole X server, or lock up the system so badly that only SysRq works, or even lock up the system so badly that only powering it off manually works.
3. And then there is the configuration atrocity. Apparently AMD is too good to store its configuration in /etc/X11/xorg.conf, or even to document the supported options there. Instead they have their own (also undocumented) configuration file in the /etc/ati folder. And it is undocumented because it is a cryptic mess and the only supported way to change it is to use their tools - aticonfig and amdcccle. The command line tool is almost reasonable, except it is also barely documented. For example, one of my screens somehow always started at 1920x1080@30Hz. There were 3 different ways to specify the default resolution, but none of them used or saved the refresh rate. And when I changed it in the GUI tool, the refresh rate did change, but it was never saved. Oh, and there is no save button anywhere. It 'just works', except when it doesn't. Like: both of my screens for some reason started with huge black borders around the picture. I finally narrowed it down to the GUI setting "overscan", which defaulted to 10%. Ok, so I change it, it works, but next time I reboot, the overscan is back! I had to find an undocumented invocation of aticonfig that would change the default value to 0%. Why did this one setting not save? Oh, and a fun note - the refresh rate of that second screen was correct on the login screen, but it switched back as soon as I logged in. Fun, huh? (A workaround sketch follows this list.)
4. Even at basic desktop tasks fglrx is inferior not only to the free driver, but also to the NVidia driver - even simple scrolling of a large folder in Nautilus seems to tax the $200 card to its limits; the bottom row blinks into place almost half a second after I stop scrolling. Another example: with NVidia, when I switch my TV to the HDMI input from the card, the sound starts at the same moment as the picture, while with AMD the sound only decides to show up 10-15 seconds later. And sometimes it does not show up at all, unless I start the AMD Control GUI tool - and only then does the sound appear, 15 seconds later, without me doing anything in the GUI.
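
To illustrate the refresh rate problem in point 3: since the driver never saves the rate, the only workaround I can see is reapplying the mode with plain xrandr at every login. A rough sketch, with a placeholder output name:

    # fglrx forgets the refresh rate, so reapply it after logging in;
    # "DFP2" is a placeholder - check the real output name with "xrandr -q"
    xrandr --output DFP2 --mode 1920x1080 --rate 60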

It may be that one part of AMD is better than NVidia at talking to free driver developers, but another part is so much worse at the actual technical work of writing a driver that it is not even funny. They are busy reinventing the wheel of configuration and display management, while their core driver is just ... not good enough.

TL;DR: Anyone wanna buy an HD 6850 cheaply off my hands?

Crossposted to Google+ - https://plus.google.com/u/0/107099528362923100900/posts/6PC7RFQpm8K

P.S. I also noticed a color difference - with NVidia there was no difference between the colors on my LCD TV and my IPS monitor, but with AMD there is a huge difference: the TV colors just look washed out. I guess there is no proper color calibration support in the AMD driver?

Update: I have managed to return the HD 6850 to the shop where I got it from (thanks to a nice law requiring web shops to take stuff back within 14 days, no questions asked) and got a new NVidia GeForce GTX 660 instead. I had to build an updated NVidia driver (304.48 from this post, just like here or here), but other than that it was smooth and painless and everything is working great again. Only even faster :)
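
For the record, the generic route with NVidia's own .run installer looks roughly like this; on Debian, rebuilding the packaged driver is the tidier option, so treat this as a sketch, with the display manager name as a placeholder:

    # stop the display manager before installing (gdm3 here; yours may differ)
    sudo service gdm3 stop
    # NVidia's installer builds and installs the kernel module for the running kernel
    sudo sh NVIDIA-Linux-x86_64-304.48.run
    sudo service gdm3 start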


Comments

Ralf 4 years, 7 months ago

You forgot one thing: AMD drops support for cards just some 3 years after they were released. Compare that with NVidia, which still supports the GeForce 8 series in my more than five-year-old desktop PC. Oh, and when there's a new X server, my favourite IT news site usually contains a sentence like "An update for the NVidia proprietary driver is already available, while for the fglrx driver, no update has been released so far". Similar for new OpenGL versions.

However, I wonder why you even had to deal with their config tools. I used an ATI card for two years and never once launched a single ATI-specific tool. xrandr did all I wanted (and fglrx has supported xrandr for some years already, unlike the NVidia driver).
When you are using a notebook, though, the missing Optimus support means that you can't use the NVidia card *at all*, which is still far worse than anything ATI has forced me to do so far. I am using Bumblebee now, which is an awesome project, but the performance hit is something like 50%... On the other hand, having the NVidia card running on its own second X server means that I can have both KMS (on the Intel card) and the proprietary NVidia driver, which is pretty awesome :D
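
For the record, the kind of invocation I mean; the output names are placeholders and vary by driver, so check "xrandr -q" first:

    # external screen to the right of the panel; "LVDS" and "DFP1" are placeholders
    xrandr --output LVDS --auto --output DFP1 --auto --right-of LVDS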


Alice Cooper 4 years, 7 months ago

Should be "Its Linux drivers", not "It's Linux drivers"...


Jan Hudec 4 years, 7 months ago

Have you tried the AMD-provided driver (fglrx) or the Mesa one (r300)? My experience is that the fglrx one sucks big time (it does not understand half of OpenGL), but r300 finally became usable a year or two ago. It's still slower, but I suspect fglrx is faster only because it does not do some of the things it is told to do.


Oliver Kraitschy 4 years, 7 months ago

I have a different experience with the fglrx driver. I don't have crashes or visual corruption or performance issues, and the games I tried do work fine. Both on Debian and Arch Linux.

Also, what you said about the configuration is not true. fglrx doubtless uses /etc/X11/xorg.conf, and you can use the tool aticonfig to create or modify an xorg.conf file.
Also, you get a whole bunch of available options with aticonfig --help, so saying that it is barely documented is kind of exaggerated ;-)
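
For example, a minimal sketch (assuming a single card; run as root):

    # write an initial xorg.conf that loads fglrx
    aticonfig --initial
    # list the options aticonfig knows about
    aticonfig --help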


aigarius 4 years, 7 months ago

Oliver, let me be a bit more specific then. AMD developers have stated that they treat xorg.conf as "the configuration of the X server" and /etc/ati/... as the configuration of their driver. All of xorg.conf gets treated as a "hint". While it appears that aticonfig creates an xorg.conf file, what it really creates is a virtual xorg.conf; the real, effective configuration lives in /etc/ati, and that is the configuration that matters.

As for documentation: compare the output of "aticonfig --help" to ftp://download.nvidia.com/XFree86/Linux-x86/1.0-7174/README.txt , especially section 3 and the appendices from D onwards. As an example: it is possible to set a default overscan value with aticonfig. Can you find that in "aticonfig --help"? No, because you need to use the "--set-pcs-str" option with the correct (undocumented) parameters to do it. Oh, and you cannot find them in the configuration file either, because a key does not appear in the file before it has been set. Also, apparently you can set the display size with aticonfig, but not the refresh rate. Sure, you can set it with xrandr, but then it is not saved in /etc/ati/... and thus switches back when you restart X. Oh, and don't try making any changes to /etc/ati while the X server is running - fglrx will happily overwrite your changes on exit.
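
To give a flavour of the incantation involved - the key string below is illustrative only, since the real PCS keys are undocumented and I am quoting from memory:

    # hypothetical key name; real keys only show up under /etc/ati after being set
    sudo aticonfig --set-pcs-str="MCIL,DigitalHDTVDefaultUnderscan,false"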

It is madness!

Oh and I was trying to get fglrx to work, because I bought a $200 video card with the explicit purpose of playing some semi-modern video games. The free driver is fantastic in 2D and in running Unity, but for real games, it is, unfortunately, way too slow and lacks most OpenGL features.


aigarius 4 years, 7 months ago

Let me also describe some issues that I had with AMD and fglrx that did not exist on NVidia. There are a few big games that work well - the big 3 from Blizzard work fine, Civ5 works fine, and games from Valve on the Source engine work ok-ish:
* they break horribly if you turn on "multicore rendering" option
* when they decide to switch to a lower resolution, you only get the game picture in a corner of the screen and your mouse position gets de-synced from the clicking position, so you have to guess where your mouse should be in order to click through to the video settings and switch the resolution back to native
* there is a weird video sync line at around 60% of the screen height: everything below that line is delayed by a fraction of a second, and UI elements around that area get distorted or disappear entirely (such as ammo and health pickups in HL2)

I could grit my teeth and live with that, but then there are games like Bioshock, which worked perfectly fine with NVidia, but with AMD and fglrx there is such heavy corruption - random triangles dancing around the screen, all models distorted, all effects causing even more glitch chaos, every tenth gunshot causing a full-on crash - that it is unplayable, and no settings changes do anything to fix it.

The most common feature of other games is a hard lockup at random times. Not fun.


Francis 4 years, 7 months ago

Huh. My experience with ATI/AMD cards is:

- Insert Card
- Boot up
- Don't even think about fglrx
- FPS is just fine, even with a GL-based compositor.
- Further improvements to be had by downloading and compiling newer versions of libdrm and mesa (rough sketch below)
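
Roughly the classic autotools flow of the time, run in each source tree (libdrm first, then mesa); the flags and prefix are illustrative, not a recommendation for any particular distro:

    # run in the libdrm tree, then again in the mesa tree
    ./autogen.sh --prefix=/usr
    make -j4
    sudo make install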

nVidia has never let me do that. Their drivers do, however, turn computers into toaster ovens.

There are a few caveats with getting it to work with wine (most win applications are still 32 bit it seems), but that's more to do with game developers being teh suck. I hear Source is coming to linux :)


ChrisK 4 years, 7 months ago

Aigarius: I want to thank you for posting your experience with an AMD card, as I'm always curious to know if things have gotten better with ATi/AMD.

I've been having some interesting issues with the Nvidia proprietary driver on my T61p laptop, which has a Quadro FX 570M card in it. With the 304 drivers, only the resolution 1600x1050 is available on the laptop's screen. This is annoying because the external monitor I have can't use that resolution, so I cannot use TwinView. I probably need to switch to using Nouveau and see how that goes.

But either way I'm staying away from AMD. ;-) Thanks.


aigarius 4 years, 7 months ago

You can use all kinds of configurations with TwinView nowadays, including side-by-side screens with different resolutions or even overlapping screens.


aigarius 4 years, 7 months ago

Fixed. Also testing threaded comments :)


ChrisK 4 years, 7 months ago

Overlapping screens via TwinView seems not to be possible for me right now. :-/ Perhaps it's possible to do via "Separate X screens".


aigarius 4 years, 7 months ago

Here is what I got when trying that right now - http://www.zimagez.com/zimage/screenshot-210912-185403.php and it worked just fine.


ChrisK 4 years, 7 months ago

I get this:
ftp://ftp.coredump.us/Screenshots/NVIDIA_TwinView_screenshot.png
after which none of the settings seem to apply. I'll try to investigate further.


aigarius 4 years, 7 months ago

Try installing the newest NVidia driver, turn the Advanced options on, and see if there is anything on that screen: in the Selection combo box, try picking the X server line and experimenting with MetaModes (advanced). Are you running this as root? You can also try setting up a configuration there, saving it to xorg.conf via the button, and restarting. It could be that some TwinView or xrandr part is not loaded in your current configuration.
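
For reference, a TwinView MetaModes setup in xorg.conf looks roughly like this; the output names and modes are placeholders, so check them with nvidia-settings or xrandr on your machine:

    Section "Screen"
        Identifier "Screen0"
        Device     "Device0"
        Option     "TwinView"  "True"
        # DFP-0/DFP-1 and the modes are placeholders for your actual outputs
        Option     "MetaModes" "DFP-0: 1920x1200 +0+0, DFP-1: 1280x1024 +1920+0"
    EndSection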
