SVN 11936 - Messed up graphics, segfault on start

Find any bugs in Vega Strike? See if someone has already found it, or report them here!
Miramor
ISO Party Member
Posts: 410
Joined: Tue Jun 26, 2007 7:15 pm


Post by Miramor »

When I start the latest Vega Strike SVN, the loading splash screen isn't displayed correctly - it looks like a stereogram, with a blurred vertical slice of the image repeated horizontally. I think this might have to do with the recent changes to the software DDS decompression code.

I would have a screenshot, but the splash image doesn't last long enough - almost immediately after it is displayed, the game segfaults.

For the record, I'm currently using the old i810 driver for Xorg instead of the new intel one - i810 is much faster and more stable, and I was hoping it might help with the performance problems I had been experiencing before.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

The old Intel driver may not support hardware DDS loading at all - which you know already.

It would be helpful if you could determine which image is not displaying correctly. Or are all of the images displaying incorrectly?

The fix I made to the software DDS code would only cause errors if some other part of the code were broken. And if that code were broken, I wouldn't be able to get good spheremaps - and I do.

I just made some fixes to data4.x, which had invalid DDS files (below the minimum size) that the software decoder would not enjoy. These may be getting loaded and causing your weird memory issues... I don't know.

Update data4.x and try again. Be sure to remove the spheremaps in ~/.vegastrike/textures/backgrounds - I'm not sure whether they get regenerated every time you run the program.


Without knowing which files are bugging out for you, or seeing a pattern to the bugginess, it's impossible to tell whether the fault lies with the driver, the software decoder, or some other handler.
Ed Sweetman endorses this message.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

I just fixed your issue. I'm stupid.

SDDS (the software DDS decompressor) always outputs 32-bit data, no matter what. But in GL land, DXT1 files are correctly seen as 24-bit. This caused a conflict when software decoding was used for the env_map but hardware decoding for everything else. So it was "fixed" when I made SDDS behave like the hardware decompressor, but that broke pure software decompression.

My fix is to rewrite the image attributes when we do software decoding, so that they are all correct for the new image data.

The reason I thought SDDS needed fixing in the first place was that the texture transform code was not behaving correctly when I had an alpha channel. I must have been half asleep not to notice that the format argument had to be PNG_HAS_COLOR *and* PNG_HAS_ALPHA for it to work correctly with my image data.

Now both software and hardware decode modes work, and the background lightmaps are correct again. I tested both modes and everything looked right.

Merry Christmas.

P.S. Using software decode mode yields uncompressed textures - that's over 1.1 GB of texture data. I can't imagine your performance being any better on a card/driver that doesn't support DXT.
Ed Sweetman endorses this message.