So I ordered my new system (in bits) from Lambdatek over the weekend and it all arrived on Tuesday which, considering it came in four batches via four different carriers, is pretty good.
It's an Asus P5Q-VM motherboard, an Intel quad-core processor and 8GB of memory, with all the other bits 'generic'. Ubuntu 8.10 beta installed without any problem at all, which is a good start. However.....
I sort of knew the Intel G45 graphics was probably a bit risky, being so new, and I was right. To get it to run without X crashing I have to add 'Option "NoAccel"' to the Device section of xorg.conf. It's a known problem and there is a fix out there which will appear in the Intel X drivers in due course. I don't really need the acceleration at the moment so I can wait.
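For anyone else hitting this, the Device section ends up looking something like the following (the Identifier is whatever your existing config uses, 'Configured Video Device' being the Ubuntu default I believe):

    Section "Device"
        Identifier "Configured Video Device"
        Driver     "intel"
        Option     "NoAccel"
    EndSection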
However there is another minor problem: I can't get as high a display resolution as I want. The maximum I'm offered is 1152x864 and I normally run 1600x1200; I think the display is actually capable of more than that.
Currently I'm running the new (Ubuntu 8.10) system on the VGA input of my monitor and the old (Fedora 8) system on the DVI-D input. I did try swapping them round but the Ubuntu 8.10 system then failed to start video at all; I had to ssh into it to shut it down.
It looks as if the display/screen detection in the Intel X drivers is completely broken. The display is a Dell 2001FP and is detected without any problems by the Fedora 8 system. How can I tell the Ubuntu system's X configuration all about my display without too much hassle? Is there any utility which will interrogate the display and spew out xorg.conf configuration parameters?
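I have a vague idea the read-edid package might do something like this - as I understand it, parse-edid turns the monitor's EDID block into an xorg.conf-style Monitor section - though I haven't tried it on this box yet:

    sudo get-edid | parse-edid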
On 08/10/2008 10:57:47, Chris G wrote:
However there is another minor problem: I can't get as high a display resolution as I want. The maximum I'm offered is 1152x864 and I normally run 1600x1200; I think the display is actually capable of more than that.
I have Debian on my desktop and I recently had a similar problem. After an upgrade of X it chose a default resolution that was substantially poorer than the monitor can cope with.
What I think happened in my case is that X switched from using the values in the x.org config file by default to using the values obtained from the monitor, and then applied an incorrect or too-simplistic algorithm to choosing the best mode.
The solution in my case was to add an option to the x.org config file: "PreferredMode". This makes the whole of the monitor portion of my x.org config file as follows:
Section "Monitor" Identifier "CPD-G500" Option "DPMS" ModeLine "1600x1200" 229.50 1600 1668 1860 2152 1200 1201 1 204 1250 +hsync +vsync Option "PreferredMode" "1600x1200" EndSection
The ModeLine above is not wrapped in my config file. The timings in this line are those reported by the monitor, as seen by examining the X log file (Xorg.0.log for display :0).
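If it helps, the monitor-reported modes appear in that log as Modeline entries, so something like this should pull them out (that path being the usual Debian/Ubuntu location):

    grep -i modeline /var/log/Xorg.0.log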
HTH, Steve.
Hi
2008/10/8 Steve Fosdick <lists@pelvoux.nildram.co.uk>:
On 08/10/2008 10:57:47, Chris G wrote:
The solution in my case was to add an option to the x.org config file: "PreferredMode". This makes the whole of the monitor portion of my x.org config file as follows:
Section "Monitor" Identifier "CPD-G500" Option "DPMS" ModeLine "1600x1200" 229.50 1600 1668 1860 2152 1200 1201 1 204 1250 +hsync +vsync Option "PreferredMode" "1600x1200" EndSection
Wow. Not seen a ModeLine entry since the XFree86 3.x days. Out of curiosity, why is it needed (on modern Xorg servers)?
I recently tried starting Xorg with no xorg.conf file, and it worked first time, with the correct resolution. Only had to create xorg.conf to add UK keyboard map... and change from the i810 driver to the intel driver...
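For the record, a minimal xorg.conf doing just those two things would look something like this (the Identifiers are arbitrary, and I'm going from memory here):

    Section "InputDevice"
        Identifier "Generic Keyboard"
        Driver     "kbd"
        Option     "XkbLayout" "gb"
    EndSection

    Section "Device"
        Identifier "Configured Video Device"
        Driver     "intel"
    EndSection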
Odd, I also cannot find PreferredMode in the xorg.conf man page... Are you running some kind of custom X server?
I set the 'preferred mode' in the Display subsection of the Screen section... do people do it another way?
Modes "1600x1200" "1280x1024" "1024x768" "800x600" "640x480"
The first resolution seems to become the default.
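In context that's something like the following (the Identifiers being whatever the rest of your config uses):

    Section "Screen"
        Identifier "Default Screen"
        Device     "Configured Video Device"
        DefaultDepth 24
        SubSection "Display"
            Depth 24
            Modes "1600x1200" "1280x1024" "1024x768" "800x600" "640x480"
        EndSubSection
    EndSection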
- Srdjan
2008/10/8 Srdjan Todorovic <todorovic.s@googlemail.com>:
Wow. Not seen a ModeLine entry since the XFree86 3.x days. Out of curiosity, why is it needed (on modern Xorg servers)?
Forget that question. A better set of questions is: why did Xorg make a bad choice from the EDID data? Which version of Xorg is this, and should we all expect to see similar problems?
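For comparing notes, either of these should report the server version (the second assuming a Debian/Ubuntu system):

    X -version
    apt-cache policy xserver-xorg-core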
- Srdjan
On Wed, Oct 08, 2008 at 01:15:19PM +0100, Steve Fosdick wrote:
On 08/10/2008 10:57:47, Chris G wrote:
However there is another minor problem: I can't get as high a display resolution as I want. The maximum I'm offered is 1152x864 and I normally run 1600x1200; I think the display is actually capable of more than that.
I have Debian on my desktop and I recently had a similar problem. After an upgrade of X it chose a default resolution that was substantially poorer than the monitor can cope with.
What I think happened in my case is that X switched from using the values in the x.org config file by default to using the values obtained from the monitor, and then applied an incorrect or too-simplistic algorithm to choosing the best mode.
The solution in my case was to add an option to the x.org config file: "PreferredMode". This makes the whole of the monitor portion of my x.org config file as follows:
Section "Monitor" Identifier "CPD-G500" Option "DPMS" ModeLine "1600x1200" 229.50 1600 1668 1860 2152 1200 1201 1 204 1250 +hsync +vsync Option "PreferredMode" "1600x1200" EndSection
The ModeLine above is not wrapped in my config file. The timings in this line are those reported by the monitor, as seen by examining the X log file (Xorg.0.log for display :0).
In the end it was easier than that. I was confused by the Gnome Screen Resolution applet: when I turned off Mirror Screens (whatever that means) I was able to select other refresh rates in addition to the 60Hz that was originally there. Selecting 75Hz gave me the 1600x1200 resolution; once I had selected it the refresh rate went back to 60Hz, but at least it worked!
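For the record, I'd guess the same switch could be made from a terminal with xrandr, something like the following, where the output name is whatever plain 'xrandr' reports for the monitor:

    xrandr --output VGA --mode 1600x1200 --rate 75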
That applet is decidedly odd! The (presumably older) version I have on my Fedora 8 system is much more straightforward.