[r1391] GL_WARP_SHADER not working
Moderator: Graf Zahl
- Hellser
- Posts: 7
- Joined: Tue Jul 26, 2011 14:14
[r1391] GL_WARP_SHADER not working
As the title says, GL_WARP_SHADER is set to false in my INI file, and yet the textures, flats (and even sprites?) still use the shaders to handle the warping.
Not sure if this is intentional, as I no longer see the option in the OpenGL menus.
Edit: I also brought down the console and typed GL_WARP_SHADER; it reports: GL_WARP_SHADER is "FALSE".
Edit 2: I had my friend check it out on his computer; he's having the same issue.
Last edited by Hellser on Sun May 20, 2012 14:34, edited 1 time in total.
- Enjay
- Developer
- Posts: 4748
- Joined: Tue Aug 30, 2005 23:19
- Location: Scotland
- Contact:
Re: [r1391] GL_WARP_SHADER not working
Knowing which graphics card you are using (and other relevant system details like OS and driver version) might help those who know these things answer your question.
A (sort of) related question to which I'm pretty sure the answer will be no: is it possible to disable shader warping on a texture-by-texture basis? The reason I ask is that I have a texture that warps, but I also want it to have a brightmap. I can't have two shaders working on the same surface, so it has to be one or the other. However, if I could use non-shader warping on the texture, the drop in animation quality would be acceptable if it allowed the brightmap to work. Either that, or I could figure out how to make a many-framed traditional animation that looks like warping... somehow.
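The warp-versus-brightmap conflict described above can be sketched with a minimal definition pair. (The texture and brightmap names below are placeholders for illustration, not lumps from this thread; whichever of the two shaders the engine applies, the other is lost on that surface.)

```
// ANIMDEFS: request shader warping on a texture
warp texture LAVA1

// GLDEFS: request a brightmap on the same texture
brightmap texture LAVA1
{
    map LAVA1BM
}
```

With both definitions present, only one shader can run on LAVA1, which is exactly the limitation being asked about.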
- Hellser
- Posts: 7
- Joined: Tue Jul 26, 2011 14:14
Re: [r1391] GL_WARP_SHADER not working
For relevant system details:
How things are set up in the wad/pk3:
Spoiler: ANIMDEFS
From what I can see, there's nothing wrong with the ANIMDEFS (that's the only file in the wad apart from a stationary flat with no animation). The warp shaders ARE working fine; I just cannot turn them off. I prefer the software style of warping.
Edit: After looking at the code, I realize that the first section pretty much isn't needed, but it doesn't explain the shaders refusing to turn off.
Edit 2: Ryan (Cordell, yes, him) looked at the source code for me when I asked him to check things out, and apparently he found this:
Ryan Cordell says:
*I'm reading up on GL_Shader..
*and it says this:
*"These will only have an effect on SM3 cards. For SM4 they are always on and for SM2 always off"
Any particular reason why they cannot be turned off (seeing as a GTX 260 is an SM4 card)?
- Enjay
- Developer
- Posts: 4748
- Joined: Tue Aug 30, 2005 23:19
- Location: Scotland
- Contact:
Re: [r1391] GL_WARP_SHADER not working
That's why I wondered about your graphics card. I know that with the newer/better-featured cards certain shader options are on, but with older cards they are off (and I believe there are some middle-ground cards where the user can decide what works best via the options menu). I wasn't sure whether it was possible to force the options via the console for the "always on" cards, though. It seems that it isn't.
- Hellser
- Posts: 7
- Joined: Tue Jul 26, 2011 14:14
Re: [r1391] GL_WARP_SHADER not working
Would it be possible to revert to the previous way of handling the warp shaders, unless it had a rendering issue with the current GZDoom? (Skulltag, yes, old, but still semi-relevant, has the option in its menu, and AFAIK it uses an older GZDoom engine.)
- Graf Zahl
- GZDoom Developer
- Posts: 7148
- Joined: Wed Jul 20, 2005 9:48
- Location: Germany
- Contact:
Re: [r1391] GL_WARP_SHADER not working
No, this won't be changed.
The software warping is a fallback for old hardware. The optimized rendering path for SM4 cannot deal well with it because it assumes that textures won't have to be changed once they are set up.
The chance of it completely disappearing is a lot higher. I won't keep support for SM2 and SM3 for the handful of people still sticking to outdated hardware. I'd have no problems whatsoever removing pre-SM3 support right now if the code wasn't still needed for SM3.
- Gez
- Developer
- Posts: 1399
- Joined: Mon Oct 22, 2007 16:47
Re: [r1391] GL_WARP_SHADER not working
Digging in the source is nice and all, but the wiki is here, you know.
- NeuralStunner
- Posts: 253
- Joined: Tue Dec 29, 2009 3:46
- Location: IN SPACE
- Contact:
Re: [r1391] GL_WARP_SHADER not working
Honestly, that's a bit off-putting, considering a texture can have only one shader. So no warp + fullbright-glow*, which has bitten me before.
It seems a bit silly to force a setting just because the hardware is capable. Would you force sound on because the user has a sound card?
* This limitation might be worked around by writing a shader that does both, one for each of WARP and WARP2. Don't ask me how, though.
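For context, WARP and WARP2 in the footnote above are the two warp styles an ANIMDEFS lump can request. A minimal sketch, using placeholder texture names:

```
// ANIMDEFS: the two warp styles being discussed
warp  texture WATERTEX    // classic warp distortion
warp2 texture LAVATEX     // alternative, stronger warp style
```

An optional speed value can follow the texture name; a combined warp-plus-glow shader would have to reproduce each of these distortions itself.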
Dean Koontz wrote:Human beings can always be relied upon to exert, with vigor, their God-given right to be stupid.
Spoiler: System Specs
- Graf Zahl
- GZDoom Developer
- Posts: 7148
- Joined: Wed Jul 20, 2005 9:48
- Location: Germany
- Contact:
Re: [r1391] GL_WARP_SHADER not working
This is the inevitable result of not being able to write clean code: it has to be aware of every eventuality of old hardware being present.
If I could just dump all the old compatibility garbage the shader code would be a lot easier to handle. But I'm not in the position to develop software that needs a modern (and fast) graphics card. Even first generation Shader Model 4 cards show performance issues with the shaders.