Support Texture Compression?

Deathlike2
Posts: 233
Joined: Sat Oct 29, 2005 0:40

Support Texture Compression?

Post by Deathlike2 »

It should help improve performance... although I have no real idea how much of an image quality loss it would have...

Practically all modern video cards support it...
lemonzest
Posts: 101
Joined: Wed Aug 31, 2005 19:03
Location: Nottingham, UK

Post by lemonzest »

ST3C hardware compression?
Jin
Posts: 20
Joined: Sun Jan 01, 2006 22:16
Location: Ontario, Canada
Contact:

Post by Jin »

Not really needed IMHO. Since when are textures really that large unless you use a format like BMP? Just don't oversize your textures.
smg m7
Posts: 81
Joined: Mon Sep 26, 2005 17:48
Location: Here, I hope.
Contact:

Post by smg m7 »

Large skies could be a problem, but then you just use PNG or JPEG.
Deathlike2
Posts: 233
Joined: Sat Oct 29, 2005 0:40

Post by Deathlike2 »

lemonzest wrote:ST3C hardware compression?
Yes, that's what I meant. Well, S3TC anyway... I'm definitely not talking about image compression formats like .png and .jpg...
smg m7
Posts: 81
Joined: Mon Sep 26, 2005 17:48
Location: Here, I hope.
Contact:

Post by smg m7 »

No, but why would you need internal texture compression in GZDoom when the file formats are much easier and work fine?
SlayeR
Posts: 120
Joined: Wed Aug 31, 2005 6:23
Location: Somewhere
Contact:

Post by SlayeR »

S3TC looks ugly as hell, and I notice no performance increase whatsoever.
DaniJ
Posts: 130
Joined: Sat Oct 08, 2005 19:22

Post by DaniJ »

S3TC is only of benefit when you're using LOTS of hires graphics.

So if you don't use any of that stuff when playing DOOM, then playing with S3TC won't make any difference other than degrading the graphics, since they are so low-res. When applied to hires textures (i.e. anything larger than 512x512) there is no visible difference, provided the engine selects the correct S3TC format for each graphic. There are several types of S3TC texture format; the one you choose depends on the texture in question. For example, there are three formats that provide alpha data (1-bit, 8-bit and something else).

Doomsday supports S3TC, but since you can easily tell the difference (not hard, given how low-res the sprites are), you shouldn't use it unless you play with models & hires textures.
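For illustration, a minimal sketch of the per-texture format choice DaniJ describes, assuming GL_EXT_texture_compression_s3tc is available (the helper name and its flags are made up for this example, not anything from GZDoom or Doomsday):

Code:

    #include <GL/gl.h>
    #include <GL/glext.h>

    // Pick an S3TC internal format based on the texture's alpha content.
    GLenum ChooseS3TCFormat(bool hasAlpha, bool sharpAlpha)
    {
        if (!hasAlpha)  return GL_COMPRESSED_RGB_S3TC_DXT1_EXT;   // no alpha
        if (sharpAlpha) return GL_COMPRESSED_RGBA_S3TC_DXT1_EXT;  // 1-bit (on/off) alpha
        return GL_COMPRESSED_RGBA_S3TC_DXT5_EXT;                  // smooth 8-bit alpha
    }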
Deathlike2
Posts: 233
Joined: Sat Oct 29, 2005 0:40

Post by Deathlike2 »

smg m7 wrote:No, but why would you need internal image compression in GZdoom when the file formats are much easier and work fine?
I'm not saying to use S3TC compression for the image files... I'm saying to use S3TC at runtime (there is code out there to load PNG files and upload them as S3TC textures in OpenGL).
SlayeR wrote:S3TC looks ugly as hell, and I notice no performance increase whatsoever.
There are definitely some circumstances where it really does look bad, but this option would have to be toggleable if you hate it that much. It's beneficial if you use lots of textures that would consume more video memory than you actually have (which forces the driver to fall back to system memory, and that's a performance killer). Obviously video cards with more memory will have no problem... but this option would be most beneficial for people with older cards that have limited amounts of memory.
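To put rough numbers on the memory argument: a 1024x1024 RGBA8 texture takes 4 MB uncompressed, about 1 MB as DXT5 (4:1) and about 0.5 MB as DXT1 (8:1). A minimal sketch of the runtime approach, assuming pixels, width and height hold the decoded PNG data; requesting a GL_COMPRESSED_* internal format asks the driver to compress on upload:

Code:

    #include <GL/gl.h>
    #include <GL/glext.h>

    // Let the driver compress the decoded PNG data at upload time.
    void UploadCompressedTexture(const unsigned char *pixels, int width, int height)
    {
        glHint(GL_TEXTURE_COMPRESSION_HINT, GL_NICEST);  // favor quality over upload speed
        glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                     width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    }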
Graf Zahl
GZDoom Developer
Posts: 7148
Joined: Wed Jul 20, 2005 9:48
Location: Germany
Contact:

Post by Graf Zahl »

...and those weak cards are mostly too old to support texture compression well. If you just need more memory you can set the texture format to 16 bit and save half of the original amount.
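For comparison, the 16-bit fallback Graf Zahl mentions is just a different internal format at upload time (a sketch using the same assumed variables as above; it needs no compression support from the card):

Code:

    #include <GL/gl.h>

    // A 16-bit internal format halves the size of a 32-bit RGBA texture.
    void Upload16BitTexture(const unsigned char *pixels, int width, int height)
    {
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA4,   // or GL_RGB5_A1 for 1-bit alpha
                     width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    }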
Deathlike2
Posts: 233
Joined: Sat Oct 29, 2005 0:40

Post by Deathlike2 »

Graf Zahl wrote:...and those weak cards are mostly too old to support texture compression well. If you just need more memory you can set the texture format to 16 bit and save half of the original amount.
Well... you just need a GeForce or Radeon (either of the original variants) to have S3TC support (they just need drivers that support it).

I recall older drivers exposing such a control, but it's not readily available in the Control Panel unless you have a tweaking tool such as RivaTuner to change the settings.

What you may have seen was this:
http://en.wikipedia.org/wiki/S3TC
Wikipedia wrote:Like many modern image compression algorithms, S3TC only specifies the method used to decompress images, allowing implementers to design the compression algorithm to suit their specific needs. The early compression routines were not optimal, and although since greatly improved, hindered early adoption of S3TC by developers. The nVidia GeForce 1 through to GeForce 4 cards also used 16 bit interpolation to render DXT1 textures, which resulted in banding when unpacking textures with color gradients. Again, this created an unfavorable impression of texture compression, not related to the fundamentals of the codec itself.
This was well documented and very obvious to see. There are controls in RivaTuner to force DXT3 instead, to improve the image quality. This bug never appeared on a Radeon.

So.. is there a yes or no to this?
Jin
Posts: 20
Joined: Sun Jan 01, 2006 22:16
Location: Ontario, Canada
Contact:

Post by Jin »

I'm going to guess it's a no.