I've taken a look at my Xorg.0.log file and, among other things, it has:-
(--) GLINT(0): FIFO Size is 256 DWORDS
(==) GLINT(0): Min pixel clock is 16 MHz
(--) GLINT(0): Max pixel clock is 110 MHz
(II) GLINT(0): LGFlatron: Using hsync range of 30.00-107.00 kHz
(II) GLINT(0): LGFlatron: Using vrefresh range of 50.00-200.00 Hz
(II) GLINT(0): Clock range: 16.25 to 110.00 MHz
(II) GLINT(0): Not using default mode "320x175" (bad mode clock/interlace/doublescan)
(II) GLINT(0): Not using default mode "320x200" (bad mode clock/interlace/doublescan)
(II) GLINT(0): Not using default mode "360x200" (bad mode clock/interlace/doublescan)
(II) GLINT(0): Not using default mode "320x240" (bad mode clock/interlace/doublescan)
(II) GLINT(0): Not using default mode "320x240" (bad mode clock/interlace/doublescan)
(II) GLINT(0): Not using default mode "320x240" (bad mode clock/interlace/doublescan)
(II) GLINT(0): Not using default mode "320x240" (bad mode clock/interlace/doublescan)
(II) GLINT(0): Not using default mode "400x300" (bad mode clock/interlace/doublescan)
(II) GLINT(0): Not using default mode "400x300" (bad mode clock/interlace/doublescan)
... ... ...
(II) GLINT(0): Not using default mode "1600x1200" (bad mode clock/interlace/doublescan)
... ... ...
(II) GLINT(0): Not using mode "1600x1200" (no mode of this name)
... ... ...
(--) GLINT(0): Virtual size is 1280x1024 (pitch 1280)
(**) GLINT(0): *Default mode "1280x1024": 108.0 MHz, 64.0 kHz, 60.0 Hz
(II) GLINT(0): Modeline "1280x1024" 108.00 1280 1328 1440 1688 1024 1025 1028 1066 +hsync +vsync
(**) GLINT(0): *Default mode "1024x768": 94.5 MHz, 68.7 kHz, 85.0 Hz
(II) GLINT(0): Modeline "1024x768" 94.50 1024 1072 1168 1376 768 769 772 808 +hsync +vsync
(**) GLINT(0): Default mode "1280x960": 108.0 MHz, 60.0 kHz, 60.0 Hz
(II) GLINT(0): Modeline "1280x960" 108.00 1280 1376 1488 1800 960 961 964 1000 +hsync +vsync
(**) GLINT(0): Default mode "1152x864": 108.0 MHz, 67.5 kHz, 75.0 Hz
(II) GLINT(0): Modeline "1152x864" 108.00 1152 1216 1344 1600 864 865 868 900 +hsync +vsync
(**) GLINT(0): Default mode "1152x768": 65.0 MHz, 44.2 kHz, 54.8 Hz
The 1280x1024 line is the highest resolution one that appears with that "Modeline" start; does that mean it's the highest resolution that Xorg thinks is possible?
Chris Green wrote:
I've taken a look at my Xorg.0.log file and, among other things, it has:-
[Xorg.0.log excerpt snipped - see original message above]
The 1280x1024 line is the highest resolution one that appears with that "Modeline" start; does that mean it's the highest resolution that Xorg thinks is possible?
Quite possibly. In mine I remember seeing a section where it changed the specified horizontal and vertical sync frequencies to meet some specified recommendation, CCE or something. Maybe it has done something similar to yours, which results in the change in maximum resolution.
Ian
On Thursday 25 November 2004 9:00 pm, Chris Green wrote:
I've taken a look at my Xorg.0.log file and, among other things, it has:-
[Xorg.0.log excerpt snipped - see original message above]
The 1280x1024 line is the highest resolution one that appears with that "Modeline" start; does that mean it's the highest resolution that Xorg thinks is possible?
I still maintain that either you need a modeline for 1600x1200 or the one you have is incorrect.
It looks to me as though, at the start of that log, all the common resolutions that don't have a modeline (or have a broken modeline) are rejected. It then starts at the highest available resolution that does have a valid modeline.
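For reference, the standard VESA 1600x1200@60 timing, written as a modeline for the Monitor section of xorg.conf, looks like this (quoted from memory, so double-check it against your monitor's limits before using it):

  Modeline "1600x1200" 162.00 1600 1664 1856 2160 1200 1201 1204 1250 +hsync +vsync

Note the 162 MHz dot clock though - your log shows the driver limiting the pixel clock to 110 MHz, which would explain the "bad mode clock" rejections of the built-in 1600x1200 modes.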
On Thu, Nov 25, 2004 at 09:37:06PM +0000, Wayne Stallwood wrote:
I still maintain that either you need a modeline for 1600x1200 or the one you have is incorrect.
It looks to me as though, at the start of that log, all the common resolutions that don't have a modeline (or have a broken modeline) are rejected. It then starts at the highest available resolution that does have a valid modeline.
On looking harder at my xorg.conf file I don't have any modelines of the format you seem to have. So I don't have any "available resolution that does have a valid modeline". Hmmm.
Maybe I'll read a bit more about the xorg.conf rules.
On Thu, Nov 25, 2004 at 09:50:09PM +0000, Chris Green wrote:
On Thu, Nov 25, 2004 at 09:37:06PM +0000, Wayne Stallwood wrote:
I still maintain that either you need a modeline for 1600x1200 or the one you have is incorrect.
It looks to me as though, at the start of that log, all the common resolutions that don't have a modeline (or have a broken modeline) are rejected. It then starts at the highest available resolution that does have a valid modeline.
On looking harder at my xorg.conf file I don't have any modelines of the format you seem to have. So I don't have any "available resolution that does have a valid modeline". Hmmm.
Maybe I'll read a bit more about the xorg.conf rules.
... and I'm none the wiser. All the HOWTOs and so on are very out of date and most imply that nowadays one shouldn't have to do anything to get the best resolution from one's monitor.
I'm not even *absolutely* sure that my video card can do 1600x1200; the monitor certainly can, because I'm looking at 1600x1200 now, but that's from Win2k on other hardware.
My video card is a "Fire GL1000 Pro", which uses the GLINT driver. Does anyone know how I can find out the best resolution it supports?
Hi Chris
Install ddcxinfo-knoppix and run "ddcxinfo-knoppix -monitor". This will return all the modes that your monitor and graphics card can support.
You may need to add the following to your apt sources: http://www.morphix.org/debian ./
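In full, something along these lines should do it (run as root):

  echo "deb http://www.morphix.org/debian ./" >> /etc/apt/sources.list
  apt-get update
  apt-get install ddcxinfo-knoppix
  ddcxinfo-knoppix -monitor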
Regards, Paul.
On Thursday 25 November 2004 22:35, Chris Green wrote:
I'm not even *absolutely* sure that my video card can do 1600x1200; the monitor certainly can, because I'm looking at 1600x1200 now, but that's from Win2k on other hardware.
My video card is a "Fire GL1000 Pro", which uses the GLINT driver. Does anyone know how I can find out the best resolution it supports?
On Thu, Nov 25, 2004 at 11:00:37PM +0000, Paul wrote:
On Thursday 25 November 2004 22:35, Chris Green wrote:
I'm not even *absolutely* sure that my video card can do 1600x1200; the monitor certainly can, because I'm looking at 1600x1200 now, but that's from Win2k on other hardware.
My video card is a "Fire GL1000 Pro", which uses the GLINT driver. Does anyone know how I can find out the best resolution it supports?
Install ddcxinfo-knoppix and run "ddcxinfo-knoppix -monitor". This will return all the modes that your monitor and graphics card can support.
You may need to add the following to your apt sources: http://www.morphix.org/debian ./
OK, thanks, I may try that.
However, I think I'm nearly there: I've changed the "DefaultDepth" line in my xorg.conf file from 24 to 16 and now I get 1600x1200 on the local screen - hurrah!
I presume this means that my Fire GL1000 Pro card *can't* support 1600x1200 with 24-bit colour depth. That's where assuming things gets you to.
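For anyone following along, the change is just in the Screen section of xorg.conf, along these lines (the Modes list here is only illustrative - keep whatever your file already has):

  Section "Screen"
      ...
      DefaultDepth 16
      SubSection "Display"
          Depth 16
          Modes "1600x1200" "1280x1024" "1024x768"
      EndSubSection
  EndSection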
So all I want to sort out now (for the moment anyway) is my printer fonts.
On Thursday 25 November 2004 11:11 pm, Chris Green wrote:
I presume this means that my Fire GL1000 Pro card *can't* support 1600x1200 with 24-bit colour depth. That's where assuming things gets you to.
It "should" do 8MB is enough memory for 24bit at 1600x1200
1600x1200 = ~ 1.9 megapixels x 3 bytes per pixel (for 24bit colour) = 5625 KB Even allowing for the fact that some cards do the allocation internally at 32bit when asked for 24bit depth you still have enough video ram.
Your RAMDAC is running at 230 MHz which is enough for a 90 Hz refresh at 1600x1200
Technically there is no reason why the card cannot display 1600x1200 with 24bit colour depth.
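To put rough numbers on that (the dot-clock figure is the standard VESA 1600x1200@75 timing, quoted from memory):

  1600 x 1200 x 4 bytes (32-bit allocation) = 7,680,000 bytes ~ 7500 KB, still under the 8 MB (8192 KB) on the card
  VESA 1600x1200@75 needs a 202.5 MHz dot clock, which a 230 MHz RAMDAC can supply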
On Thu, Nov 25, 2004 at 11:56:41PM +0000, Wayne Stallwood wrote:
On Thursday 25 November 2004 11:11 pm, Chris Green wrote:
I presume this means that my Fire GL1000 Pro card *can't* support 1600x1200 with 24-bit colour depth. That's where assuming things gets you to.
It "should" do 8MB is enough memory for 24bit at 1600x1200
1600x1200 = ~ 1.9 megapixels x 3 bytes per pixel (for 24bit colour) = 5625 KB Even allowing for the fact that some cards do the allocation internally at 32bit when asked for 24bit depth you still have enough video ram.
Your RAMDAC is running at 230 MHz which is enough for a 90 Hz refresh at 1600x1200
Technically there is no reason why the card cannot display 1600x1200 with 24bit colour depth.
Exactly, those are the sums I did too, which is why I had assumed the card would support 1600x1200 at 24-bit. However, I'm not going to waste any more time on it, as 90% of the time I use the Linux box via the X Server on my Win2k machine. It's just an occasional convenience to switch to viewing it directly, and a reduction to 16-bit colour depth isn't a problem. The lower resolution *was* a problem though, as it meant my predefined xfce screen layout was overlapping and off the screen.