1.3.13 Beta release

News about GZDoom.

Moderator: Graf Zahl

Whoo
Posts: 32
Joined: Sun Feb 03, 2008 23:28

Re: 1.3.05 Beta release

Post by Whoo »

Graf Zahl wrote:What I can say is that apparently the dynamic light shader uses something that your card can't handle.
Logfile wrote: GL_VENDOR: ATI Technologies Inc.
GL_RENDERER: ATI Radeon HD 4800 Series
GL_VERSION: 2.1.8918
An ATi Radeon card from the HD 4800 series should damn well be able to support dynamic lights.
Graf Zahl
GZDoom Developer
Posts: 7148
Joined: Wed Jul 20, 2005 9:48
Location: Germany
Contact:

Re: 1.3.05 Beta release

Post by Graf Zahl »

Orakagi wrote:Am I the only one where the dynamic lighting doesn't work? I didn't even realize that it didn't until I ran Doom on Skulltag and I saw the glowing effects. They work just fine there, just not over GZDoom.

Skulltag does something I don't like: It forces loading of the dynamic light definitions. In GZDoom they are optional. Although the definitions for these and the brightmaps come with the download they are not loaded automatically. You have to either do it manually or add them to the 'autoload' section of the .INI file. Not everyone likes the default lights so I consider forcing them bad. I normally play without them and use a reduced light definition set that only assigns lights to projectiles and uses different colors than the standard set. Had I forced the light definitions this wouldn't be possible.
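For reference, autoloading the two files might look roughly like this in the .INI (a sketch only; the exact section name and file paths depend on your installation):

```ini
; Hypothetical example -- section names and file locations vary by setup.
[Global.Autoload]
Path=lights.pk3
Path=brightmaps.pk3
```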

Whoo wrote:
Graf Zahl wrote:What I can say is that apparently the dynamic light shader uses something that your card can't handle.
Logfile wrote: GL_VENDOR: ATI Technologies Inc.
GL_RENDERER: ATI Radeon HD 4800 Series
GL_VERSION: 2.1.8918
An ATi Radeon card from the HD 4800 series should damn well be able to support dynamic lights.
The card can easily handle the non-shader lights. But the dynamic light shader needs to use dynamic 'for' loops, and I know this is something that wasn't part of the original GLSL spec. NVidia seems to have extended the language's capabilities here, but apparently ATI hasn't. Bad luck for you. Having this construct work is an absolute minimum requirement for the dynamic light shader. To be honest, I was quite surprised that it works with GLSL 1.2 on my GF8600; I thought it'd only be available on later hardware or GLSL versions. My problem is that I have to use 1.2 because the shader framework must also work on older hardware that can't do GL 3.0 (and thus GLSL 1.3 or later). Some of the code must be user-configurable, and that can't be done if I have to maintain everything in two or more variations to cover the different GLSL versions' syntactic differences.
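For illustration, a dynamic loop of the kind described above looks roughly like this (a sketch, not GZDoom's actual shader; the uniform names and light data layout are made up):

```glsl
#version 120
// The loop bound is a uniform, so the compiler cannot unroll it at
// compile time -- this is the "dynamic" loop some drivers reject.
uniform int lightcount;     // set per draw call
uniform vec4 lights[32];    // xyz = position, w = radius (illustrative layout)

vec3 addLights(vec3 base, vec3 fragpos)
{
    vec3 result = base;
    for (int i = 0; i < lightcount; i++)   // bound unknown at compile time
    {
        float dist = distance(fragpos, lights[i].xyz);
        float att = clamp(1.0 - dist / lights[i].w, 0.0, 1.0);
        result += vec3(att);
    }
    return result;
}
```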

For final confirmation, though, I'll have to wait and see what the next beta outputs. I'll have to change the error reporting code because ATI does it differently than NVidia.
Nash
Developer
Posts: 1226
Joined: Sun Sep 25, 2005 1:49
Location: Kuala Lumpur, Malaysia
Contact:

Re: 1.3.05 Beta release

Post by Nash »

Confirming that shaders for dynamic lights don't work on my ATI 4870 X2. I just upgraded to the latest ATI drivers (9.10); still no go.

I think it's a load of bollox that such an expensive graphics card can't handle this feature...
Graf Zahl
GZDoom Developer
Posts: 7148
Joined: Wed Jul 20, 2005 9:48
Location: Germany
Contact:

Re: 1.3.05 Beta release

Post by Graf Zahl »

If your driver does not want to compile the shader, it's not my fault. If what I suspect is the cause, the only alternative would be to remove the feature again. But I'm not going to do that, because it works on NVidia.

I can't rewrite the shader without the 'for' loop. I'd have to completely unroll it, which would make it unmaintainable.
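One commonly seen workaround keeps the loop bound constant so the compiler can unroll it, and masks out unused iterations arithmetically instead of branching; the obvious downside is that every fragment always pays for the maximum light count. A sketch with made-up names:

```glsl
#version 120
// Workaround sketch: constant loop bound (unrollable by the compiler),
// unused iterations zeroed out arithmetically rather than skipped.
#define MAXLIGHTS 8
uniform int lightcount;
uniform vec4 lights[MAXLIGHTS];

vec3 addLights(vec3 base, vec3 fragpos)
{
    vec3 result = base;
    for (int i = 0; i < MAXLIGHTS; i++)  // bound known at compile time
    {
        float active = float(i < lightcount);  // 1.0 or 0.0, no branch
        float dist = distance(fragpos, lights[i].xyz);
        float att = clamp(1.0 - dist / lights[i].w, 0.0, 1.0);
        result += active * vec3(att);
    }
    return result;
}
```

The cost is that MAXLIGHTS iterations always run, and raising the cap makes every fragment slower, which is part of why this approach scales poorly.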
Chris
Posts: 29
Joined: Fri Nov 28, 2008 1:18

Re: 1.3.05 Beta release

Post by Chris »

Graf Zahl wrote:But the dynamic light shader needs to use dynamic 'for' loops and I know this is something that wasn't part of the original GLSL spec. NVidia seems to have extended the language's capabilities here but ATI doesn't as it appears. Bad luck for you. Having this construct working is an absolute minimum requirement for the dynamic light shader. To be honest, I was quite surprised that it works with GLSL 1.2 on my GF8600. I thought it'd only be available on later hardware. or GLSL versions.
Dynamic branching (i.e. non-unrollable 'for' or 'while' loops) in pixel shaders has been available since the GeForce 6 series. It's part of the Shader Model 3.0 spec, so any SM3-capable/DX9-based card should be able to do it. Unrollable for/while loops, where the number of iterations is known at compile time, can be done on even earlier hardware.
Gez
Developer
Posts: 1399
Joined: Mon Oct 22, 2007 16:47

Re: 1.3.05 Beta release

Post by Gez »

Chris wrote:It's part of the Shader Model 3.0 spec, so any SM3-capable/DX9-based card should be able to do it.
Wikipedia claims that DX9 compatibility is not a guarantee of SM3 capability, since it's specifically Direct3D 9.0c that introduces SM3 support. So a Radeon R420 supports the earliest version of DirectX 9, but that's not enough. From the Radeon R520 on, it should be okay (any ATI card numbered X1000 or greater, or any HD model, should work).
Graf Zahl
GZDoom Developer
Posts: 7148
Joined: Wed Jul 20, 2005 9:48
Location: Germany
Contact:

Re: 1.3.05 Beta release

Post by Graf Zahl »

This all doesn't help if the driver does not support it. Dynamic branching is not a mandatory part of the GLSL 1.2 spec, if I remember correctly, and there's no extension covering it. So it seems ATI is just sticking to the specs. Which doesn't help me, because using GLSL 1.3 would mean upgrading to GL 3.0, which I can't do: I'd have to add checks throughout the code for things I don't want to deal with. Most importantly, it would mean maintaining two sets of shaders for everything.
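To illustrate why two shader sets would be needed: even a trivial fragment shader differs syntactically between the two language versions. Simplified sketches (variable names are made up):

```glsl
// GLSL 1.20: varyings and the versioned texture2D() lookup
#version 120
varying vec2 texcoord;
uniform sampler2D tex;
void main() { gl_FragColor = texture2D(tex, texcoord); }
```

```glsl
// The same shader in GLSL 1.30: in/out qualifiers, a user-declared
// output, and the overloaded texture() lookup
#version 130
in vec2 texcoord;
out vec4 fragcolor;
uniform sampler2D tex;
void main() { fragcolor = texture(tex, texcoord); }
```

Since GZDoom lets users supply shader code, every user-facing snippet would have to exist in both dialects.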

And a minor enhancement feature is not worth the work that'd be required here. The old texture based lighting code is still there after all.
Nash
Developer
Posts: 1226
Joined: Sun Sep 25, 2005 1:49
Location: Kuala Lumpur, Malaysia
Contact:

Re: 1.3.05 Beta release

Post by Nash »

Well that sucks. While I can afford to switch to an nvidia (hell I can afford to build a totally new rig if I wanted to), I think it is really, really dumb to do such a thing just to use shader lights for GZDoom.

(Plus my graphics card isn't obsolete... heck it hasn't even been a year yet. And it was bloody expensive)

I don't really like where this is going... :/

EDIT: I'm not blaming you Graf. I know it's not your fault. I'm just frustrated with this piece of shit ATI... it's been giving me a lot of problems. I'm not in a position to trash it though because no one wants to buy it and it's so not worth it to sell it off anyway... so I'm stuck with no other option but to use it until it's outlived its usefulness. I can so just switch to an nvidia right away but then I can't live with the fact that I have two pieces of graphics cards that are worth four digits just collecting dust in my room...
Firebrand
Dev Builds Team
Posts: 126
Joined: Mon Aug 10, 2009 21:00
Location: Mexico
Contact:

Re: 1.3.05 Beta release

Post by Firebrand »

Unfortunately that's a BIG problem with ATI cards, the drivers just suck!
I'm the ruler of the Fire Power.....
Gez
Developer
Posts: 1399
Joined: Mon Oct 22, 2007 16:47

Re: 1.3.05 Beta release

Post by Gez »

http://www.opengl.org/wiki/Detecting_the_Shader_Model
"ATI does not support higher than SM 2.0 functionality in assembly shaders."

Hmm.

http://jegx.ozone3d.net/index.php?entry ... 529-094845
"vec2 is the GLSL type to hold a 2d vector. vec2 is supported by NVIDIA and ATI. float2 is a 2d vector but for Direct3D HLSL and for Cg. The GLSL compilation for Geforce is done via the NVIDIA Cg compiler. Here is the GLSL version displayed by GPU Caps Viewer: 1.20 NVIDIA via Cg compiler. That explains why a GLSL source that contains a float2 is compilable on NVIDIA hardware. But the GLSL compiler of ATI is strict and doesn't recognize the float2 type."

Is that still true? (It was written in 2007.)

More ATI vs. Nvidia showdown:
http://www.ozone3d.net/blogs/lab/200801 ... idia-part/
http://www.pouet.net/topic.php?which=5495
http://episteme.arstechnica.com/eve/for ... 2001358831
Graf Zahl
GZDoom Developer
Posts: 7148
Joined: Wed Jul 20, 2005 9:48
Location: Germany
Contact:

Re: 1.3.05 Beta release

Post by Graf Zahl »

Sadly yes. Both NVidia and ATI managed to thoroughly mess this up.

We'll just have to wait and see what the actual error here is. For the next beta I added some code that retrieves the compile results for the individual shaders as well.
BlazingPhoenix
Posts: 488
Joined: Sun Aug 28, 2005 5:11
Contact:

Re: 1.3.06 Beta release

Post by BlazingPhoenix »

I get this weird graphical glitch when I try to play with wads such as Demon Eclipse with the 1.3.x betas: http://img101.imageshack.us/img101/135/ ... 910281.png
Enjay
Developer
Posts: 4747
Joined: Tue Aug 30, 2005 23:19
Location: Scotland
Contact:

Re: 1.3.06 Beta release

Post by Enjay »

That looks like it might be the same issue that chopkinsca reported. Are you on an ATI card too?
Graf Zahl
GZDoom Developer
Posts: 7148
Joined: Wed Jul 20, 2005 9:48
Location: Germany
Contact:

Re: 1.3.06 Beta release

Post by Graf Zahl »

Please post a savegame, system specs and a link to the version of this mod you used.
BlazingPhoenix
Posts: 488
Joined: Sun Aug 28, 2005 5:11
Contact:

Re: 1.3.06 Beta release

Post by BlazingPhoenix »

Enjay wrote:That looks like it might be the same issue that chopkinsca reported. Are you on an ATI card too?
Yes.
Graf Zahl wrote:Please post a savegame, system specs and a link to the version of this mod you used.
http://files.drdteam.org/index.php/file ... -ic209.zip - the DE incomplete beta
http://www.sendspace.com/file/p47p9c - savegame

...and I forgot how to post my system specs. >_<
But I do know my card is in the ATI Radeon X1300/X1550 series