[fixed] CameraTexture HudMessage Problem
Posted: Fri Feb 17, 2006 17:44
by Skadoomer
Here is my problem:
I'm printing a camera to the screen using the HudMessage function. It works in software mode in GZDoom, but fails to print whenever I'm using OpenGL. I've included an example of what I mean. Any suggestions on what's going wrong, or is this one of those things, like warping hi-res textures, that has just been purposely disabled?
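Roughly, the setup looks like this (a minimal sketch of the trick; the texture name CAMTEX, the camera TID 100, and the sizes are placeholders rather than my actual wad):

In ANIMDEFS:

cameratexture CAMTEX 128 128

In ACS:

#include "zcommon.acs"

script 1 ENTER
{
    // Point a camera actor (TID 100 in the map) at the CAMTEX camera texture.
    SetCameraToTexture(100, "CAMTEX", 90);

    // SetFont with a texture name makes HudMessage draw that texture
    // whenever it prints the character "A".
    SetHudSize(320, 200, FALSE);
    SetFont("CAMTEX");

    // Re-issue the message every tic with the same ID so the feed stays on screen.
    while (TRUE)
    {
        HudMessage(s:"A"; HUDMSG_PLAIN, 1, CR_UNTRANSLATED, 160.0, 100.0, 0.1);
        Delay(1);
    }
}

As far as I can tell this is just the usual image-as-HudMessage trick; the only twist is that the "font" is a camera texture.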
Posted: Fri Feb 17, 2006 17:54
by Graf Zahl
You're doing something that was never intended. The code that validates camera textures is never executed when they're used in a HUD message.
Posted: Fri Feb 17, 2006 19:41
by Nash
That's a nifty hack.
I think it should be "fixed" so it works in OpenGL...
Posted: Fri Feb 17, 2006 20:33
by Skadoomer
Graf Zahl wrote: You're doing something that was never intended. The code that validates camera textures is never executed when they're used in a HUD message.
So is this a "don't do that", then, or can I suggest it as a feature?
Posted: Fri Feb 17, 2006 22:11
by Graf Zahl
It's an 'I never thought of that' issue.
Posted: Sat Feb 18, 2006 3:10
by Skadoomer
Well, fancy that. Is it anything extensive to add to the OpenGL side of things? I've already found a few cool uses of this trick in software mode that I'd really like to keep.
Posted: Sat Feb 18, 2006 5:40
by chopkinsca
The only problem I see in software mode is how it behaves when you are viewing the automap. Just thought I'd mention that in case it's a bug in itself. (The last viewed image before going into the automap flips between its normal orientation and 90 degrees every second or so.)
Posted: Sat Feb 18, 2006 10:09
by Graf Zahl
Skadoomer wrote: Well, fancy that. Is it anything extensive to add to the OpenGL side of things? I've already found a few cool uses of this trick in software mode that I'd really like to keep.
No, it's minimal. I maintain two sets of textures: one for world geometry and one for sprites and 2D graphics. This is necessary so that there aren't any scaling artifacts on cards that don't support non-power-of-two textures.
For modern cards I could map them together, but considering that textures are very rarely used for both at the same time, I don't think it's worth the effort.
But for camera textures there is only a world texture, so I'd have to add some explicit code to map them together. Don't worry, I won't do it immediately, but it will be fixed for the next version.
Posted: Sat Feb 18, 2006 10:11
by Graf Zahl
chopkinsca wrote: The only problem I see in software mode is how it behaves when you are viewing the automap. Just thought I'd mention that in case it's a bug in itself. (The last viewed image before going into the automap flips between its normal orientation and 90 degrees every second or so.)
Interesting. That's a result of the format ZDoom stores textures in: the buffer doesn't get updated and gets flipped every frame. Best report this to Randy, as it's a bug in the camera texture code.
It won't happen with GL because I don't flip textures like this.
Posted: Sun Feb 26, 2006 23:14
by Graf Zahl
Fixed