Support Texture Compression?
Moderator: Graf Zahl
-
- Posts: 233
- Joined: Sat Oct 29, 2005 0:40
Support Texture Compression?
It should help improve performance.. although I have no real idea how much of an image quality loss it will have..
Practically all modern video cards support it...
-
- Posts: 20
- Joined: Sun Jan 01, 2006 22:16
- Location: Ontario, Canada
- Contact:
-
- Posts: 81
- Joined: Mon Sep 26, 2005 17:48
- Location: Here, I hope.
- Contact:
-
- Posts: 233
- Joined: Sat Oct 29, 2005 0:40
-
- Posts: 81
- Joined: Mon Sep 26, 2005 17:48
- Location: Here, I hope.
- Contact:
-
- Posts: 120
- Joined: Wed Aug 31, 2005 6:23
- Location: Somewhere
- Contact:
-
- Posts: 130
- Joined: Sat Oct 08, 2005 19:22
S3TC is only of benefit when you're using LOTS of hires graphics.
So if you don't use any of that stuff when playing DOOM, then playing with S3TC won't make any difference other than degrading the graphics, because they're so low res. When applied to hires textures (i.e. anything larger than 512x512) there is no visible difference, provided the engine selects the correct S3TC format for each graphic. There are several S3TC texture formats, and the one you choose depends on the texture in question. For example, there are three formats that provide alpha data (1-bit, 8-bit and something else).
Doomsday supports S3TC, but if you can tell the difference (not hard given how low res the sprites are) then you shouldn't use it unless you play with models & hires textures.
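To give a rough idea, the format selection usually boils down to something like the sketch below (the enum names come from the EXT_texture_compression_s3tc extension; the alpha flags are placeholders you'd work out while decoding the image):

```cpp
#include <GL/gl.h>
#include <GL/glext.h>  // GL_COMPRESSED_*_S3TC_* enums

// Rough sketch: pick an S3TC internal format from the texture's alpha content.
// DXT1 has no alpha or 1-bit alpha, DXT3 stores explicit 4-bit alpha,
// DXT5 stores interpolated 8-bit alpha.
GLenum PickS3TCFormat(bool hasAlpha, bool hasSharpAlpha)
{
    if (!hasAlpha)
        return GL_COMPRESSED_RGB_S3TC_DXT1_EXT;   // fully opaque
    if (hasSharpAlpha)
        return GL_COMPRESSED_RGBA_S3TC_DXT1_EXT;  // 1-bit on/off alpha
    return GL_COMPRESSED_RGBA_S3TC_DXT5_EXT;      // smooth alpha gradients
}
```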
-
- Posts: 233
- Joined: Sat Oct 29, 2005 0:40
smg m7 wrote: No, but why would you need internal image compression in GZdoom when the file formats are much easier and work fine?
I'm not saying to use S3TC compression for the image files.. I'm saying to use S3TC at runtime (there is code out there that loads PNG files and has OpenGL compress them with S3TC).
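Something along these lines is what I mean (just a rough sketch; it assumes the driver exposes GL_EXT_texture_compression_s3tc, and the function name is made up):

```cpp
#include <GL/gl.h>
#include <GL/glext.h>

// Rough sketch: upload a decoded PNG (plain 32-bit RGBA pixels) and ask the
// driver to store it as DXT5, so the compression happens at upload time.
bool UploadAsS3TC(GLsizei width, GLsizei height, const unsigned char *pixels)
{
    glTexImage2D(GL_TEXTURE_2D, 0,
                 GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,   // compressed internal format
                 width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels); // uncompressed input data

    // Check whether the driver actually compressed the texture.
    GLint compressed = GL_FALSE;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_COMPRESSED, &compressed);
    return compressed == GL_TRUE;
}
```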
SlayeR wrote: S3TC looks ugly as hell, and I notice no performance increase whatsoever.
There are definitely some circumstances where it really does look bad, but the option would have to be toggleable if you hate it that much. It's beneficial when you use lots of textures that would consume more video memory than the card actually has (at that point the driver falls back to system memory, and that's a performance killer). Cards with plenty of video memory obviously won't have a problem, but the option would be most beneficial for people with older cards and limited amounts of memory.
- Graf Zahl
- GZDoom Developer
- Posts: 7148
- Joined: Wed Jul 20, 2005 9:48
- Location: Germany
- Contact:
-
- Posts: 233
- Joined: Sat Oct 29, 2005 0:40
Graf Zahl wrote: ...and those weak cards are mostly too old to support texture compression well. If you just need more memory you can set the texture format to 16 bit and save half of the original amount.
Well.. you only need a GeForce or Radeon (even the original variants) to have S3TC support; they just need drivers that expose it.
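For what it's worth, checking for it at runtime is cheap. Something like this rough sketch (using the old-style extension string query) would do:

```cpp
#include <cstring>
#include <GL/gl.h>

// Rough sketch: look for S3TC in the extension string.  In a real engine
// you'd query this once after creating the GL context and cache the result.
bool HasS3TC()
{
    const char *ext = reinterpret_cast<const char *>(glGetString(GL_EXTENSIONS));
    return ext != nullptr &&
           std::strstr(ext, "GL_EXT_texture_compression_s3tc") != nullptr;
}
```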
I recall older drivers exposing such a control, but it's simply not available in the Control Panel unless you use a tweaking tool such as Rivatuner to change the settings.
What you may have seen was this:
http://en.wikipedia.org/wiki/S3TC
Wikipedia wrote: Like many modern image compression algorithms, S3TC only specifies the method used to decompress images, allowing implementers to design the compression algorithm to suit their specific needs. The early compression routines were not optimal, and although since greatly improved, hindered early adoption of S3TC by developers. The nVidia GeForce 1 through to GeForce 4 cards also used 16 bit interpolation to render DXT1 textures, which resulted in banding when unpacking textures with color gradients. Again, this created an unfavorable impression of texture compression, not related to the fundamentals of the codec itself.
This was well documented and very obvious to see. There are controls in Rivatuner to force DXT3 instead, which improves the image quality. This bug never appeared on a Radeon.
So.. is there a yes or no to this?
-
- Posts: 20
- Joined: Sun Jan 01, 2006 22:16
- Location: Ontario, Canada
- Contact: