Thursday, March 7, 2013

Making nVidia and Intel Play Nice

By Guest Writer +Mladen Mijatov 


A few months ago, the time to upgrade my hardware finally came, and since new
Intel processors come with integrated graphics, I was looking forward to finally
connecting my TV to my computer. In addition to the integrated Intel graphics, I also have an nVidia GeForce GT 610.

The plan was to connect the TV to the Intel GPU and the two displays I use for work to the nVidia card, so I could use the more powerful card to its fullest when working, playing
games, and watching movies. This meant the Nouveau driver was out of the question and I had to use nVidia's proprietary driver.




If at first you fail, try again


Setting up the hardware was pretty much straightforward. The software part, however,
turned out to be trickier than I originally expected.

For those not so familiar with X.Org, it's a strange beast. Its configuration
is flexible enough to allow some pretty complex layouts, yet at the
same time sufficiently mysterious that you can easily spend hours, if not days,
trying to figure out workable settings.

My first attempt was to configure the two work displays as one X.Org screen
using TwinView, with the flat-panel TV as a second screen. This would start only one display
server, with the small drawback of not being able to move windows from one X.Org
screen to the other.

My ServerLayout section looked something like this:

Section "ServerLayout"
    Identifier "Default"
    Screen 0 "Screen0" 0 0
    Screen 1 "TV" Above "Screen0"
    Option "Xinerama" "0"
EndSection

To my surprise, this configuration didn't work, even though it looked like it should.
It turned out the problem was conflicting DRI2 modules, which meant I couldn't have nVidia and Intel running on the same display server. While this outcome is disappointing,
it's not a deal-breaker for me, as I don't need both of my cards working
at the same time. Chances are, I am either going to work or watch TV.

Multi-seat for a single user


Having two display servers was, at this point, inevitable. So, being an Ubuntu user, I went looking into what options LightDM supports and how to configure multi-seat without having to butcher half of my system. As it turns out, LightDM is fairly easy to configure--a refreshing change.

While I was there, I took the opportunity to disable guest login. I don’t need
that and I like keeping things minimal.

I changed my /etc/lightdm/lightdm.conf to look like this:

[SeatDefaults] 
user-session=ubuntu 
greeter-session=unity-greeter 
allow-guest=false
[Seat:0] 
xserver-layout=Default
[Seat:1] 
xserver-layout=TV

Now, in /etc/X11/xorg.conf, I had two ServerLayout sections--each using a
different set of Screens connected to a different device. This makes LightDM
start two different display servers, telling each to use the same configuration file,
only with a different layout section.

For a while now, configuring input devices in xorg.conf has not been necessary,
as the X server itself asks udev to configure everything automatically.
This is really convenient; before, you had to manually configure and assign
input devices to a specific display server. Since udev handles input, the two display
servers in my case can share input devices, though not at the same time.

At this point, I thought everything should be set up and working. One restart
later, I found out that only the display server using the nVidia card was working properly.
The second display server started, but failed with the following
message:

[20.335] (EE) Failed to initialize GLX extension (Compatible NVIDIA X driver not found)

I thought it was strange that the Intel card was trying to load the nVidia GLX module.
After quite a bit of time spent reading log files, I noticed this line:

[18.240] (==) ModulePath set to "/usr/lib/x86_64-linux-gnu/xorg/extra-modules,/usr/lib/x

The modules located in /usr/lib/x86_64-linux-gnu/xorg/extra-modules
are all nVidia-specific. Now it's obvious to me how nVidia implements their
driver: they simply point the module lookup path at the directory with their binary
drivers, overriding the open-source modules and effectively stepping on
everybody's toes.

This is not an issue if you are only using an nVidia card, and it does make
the drivers easy to remove without leaving your system a complete mess.
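To make the conflict concrete, here is a small POSIX-shell sketch of what fixing it amounts to: filtering nVidia's extra-modules directory out of a comma-separated ModulePath so a server driving the Intel card falls back to the stock X.Org modules. The `strip_nvidia_modules` helper is mine, not part of any tool, and since the path list in the log line above is trimmed, I'm assuming the usual /usr/lib/xorg/modules as the second entry.

```shell
#!/bin/sh
# Hypothetical helper: given a comma-separated ModulePath as reported in
# the Xorg log, drop nVidia's extra-modules directory so only the stock
# open-source module directories remain.
strip_nvidia_modules() {
    printf '%s\n' "$1" | tr ',' '\n' | grep -v 'extra-modules' | paste -sd, -
}

strip_nvidia_modules "/usr/lib/x86_64-linux-gnu/xorg/extra-modules,/usr/lib/xorg/modules"
# -> /usr/lib/xorg/modules
```

This is exactly the kind of edit the Files section of an Intel-only xorg.conf performs by hand: list only the module directories you actually want searched.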

Finding BusID for your card


Often, I see people struggling to find the BusID of a graphics card when configuring
X.Org. It's not hard at all, but it's not all that obvious either. The
lspci command displays all the PCI devices on your computer
(lsusb does the same for USB).

Among other things, this is what lspci returned for my system:

00:02.0 Display controller: Intel Corporation Xeon E3-1200 v2/3rd Gen Core processor Graphics Controller
01:00.0 VGA compatible controller: NVIDIA Corporation GF119 [GeForce GT 610] (rev a1)

In my case, the BusID for the Intel card in xorg.conf is "PCI:0:2:0". Take the first
column of numbers (00:02.0), drop the leading zeros, replace the period with a colon,
and prefix it with "PCI:". One caveat: lspci prints these numbers in hexadecimal,
while xorg.conf expects decimal, so the conversion only stays this simple for
bus numbers of 9 and below.
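The conversion can also be scripted. Here is a POSIX-shell sketch (the `to_busid` helper is mine, not part of any tool) that turns lspci's hexadecimal bus:device.function notation into the decimal form xorg.conf expects:

```shell
#!/bin/sh
# Hypothetical helper: convert lspci's hexadecimal bus:device.function
# address (e.g. 01:00.0) into xorg.conf's decimal PCI:bus:device:function.
to_busid() {
    bus=${1%%:*}        # "01:00.0" -> "01"
    rest=${1#*:}        #           -> "00.0"
    dev=${rest%%.*}     #           -> "00"
    fn=${rest#*.}       #           -> "0"
    # printf evaluates 0x-prefixed operands as hex, giving decimal output
    printf 'PCI:%d:%d:%d\n' "0x$bus" "0x$dev" "0x$fn"
}

to_busid 00:02.0   # -> PCI:0:2:0  (the Intel card above)
to_busid 01:00.0   # -> PCI:1:0:0  (the GeForce GT 610)
```

Feed it addresses straight from `lspci | grep -E 'VGA|Display'` and paste the result into the Device section.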

Eureka effect


The light at the end of the tunnel was finally getting brighter. At this point, I had spent
weeks trying to make all of this work. The average individual would have given up by now,
but I like being stubborn. The reward for persevering is always sweet.

So, I configured LightDM to use two xorg.conf files. In the Intel one, I manually
set ModulePath to exclude the nVidia directory. And, as if by magic, after a restart I
now heard two login sounds.

In the end, my lightdm.conf file now looks like this:

[SeatDefaults] 
user-session=ubuntu 
greeter-session=unity-greeter 
allow-guest=false
[Seat:0] 
xserver-layout=Default 
xserver-config=xorg.conf.nvidia
[Seat:1] 
xserver-layout=TV 
xserver-config=xorg.conf.intel

And here is the contents of the Intel configuration file:

Section "ServerLayout"
    Identifier "TV"
    Screen 0 "Screen_TV" 0 0
    Option "Xinerama" "0"
EndSection

Section "Files"
    ModulePath "/usr/lib/xorg/extra-modules"
    ModulePath "/usr/lib/xorg/modules"
EndSection

Section "Monitor"
    Identifier "Monitor_TV"
    VendorName "Unknown"
    Option "DPMS"
EndSection

Section "Device"
    Identifier "Device_Intel"
    Driver "intel"
    BusID "PCI:0:2:0"
    Option "AccelMethod" "uxa"
EndSection

Section "Screen"
    Identifier "Screen_TV"
    Device "Device_Intel"
    Monitor "Monitor_TV"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
    EndSubSection
EndSection

You can leave the nVidia configuration file the way nvidia-xconfig generates it. It's also worth noting that the Intel driver in Ubuntu 12.10, which I am using, is affected by bug #1071530, which currently prevents hardware acceleration from working on Intel GPUs with the new Mesa 9.0.1 library. It's not a big issue, and it's only a matter of time until this confirmed bug is resolved.

This is how my work environment looks now:

Photo of Mladen's Desktop with Intel and Nvidia Multi-Displays


The first display server starts on tty7 and the second on tty8, which means I can
freely switch between the two using Ctrl + Alt + F7/F8 without logging out
or closing any applications. Both keyboard and mouse are assigned automatically to
the currently active display server.


-- Mladen Mijatov




