4.4 release work?

Development directions, tasks, and features being actively implemented or pursued by the development team.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

The next item up is DDS caching. We want to be able to cache all the compressed textures that VS creates from the jpg/png files not shipped as DDS, so that subsequent runs of VS load extremely fast. Since they get compressed by hardware in game anyway, there should be no decrease in quality. This caching will be driven by whatever textures VS actually compresses: every time it compresses a texture, it writes the result to the cache, and the next time it loads, it won't have to compress it.
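The compress-once, read-thereafter scheme described above can be sketched like this. This is an illustrative Python mock-up, not the actual VS code; the cache directory name, the content-hash keying, and the `compress` callback are all made up for the example.

```python
import hashlib
import os

CACHE_DIR = "texture_cache"  # hypothetical location

def cache_path(src_path):
    """Key the cache on the source file's content hash, so an edited
    texture never collides with a stale cached copy."""
    with open(src_path, "rb") as f:
        digest = hashlib.sha1(f.read()).hexdigest()
    return os.path.join(CACHE_DIR, digest + ".dds")

def load_texture(src_path, compress):
    """Return compressed texture bytes, compressing only on a cache miss."""
    cached = cache_path(src_path)
    if os.path.exists(cached):
        with open(cached, "rb") as f:
            return f.read()          # fast path: no recompression
    data = compress(src_path)        # slow path: compress once...
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(cached, "wb") as f:
        f.write(data)                # ...and remember the result
    return data
```

Hashing the contents (rather than keying on the filename alone) means a re-exported texture automatically invalidates its old cache entry.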

I'm working out the details of how I want the caching to work. Chime in if you want.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

OK guys. You're going to see about a 200 MB increase in the RAM VS uses. This is due to the PNG compression bypass, a patch that precedes the change to how we format textures in VS.

PNG files are going to be used for special textures that require absolutely lossless quality. Those textures are basically the normal maps used for bump mapping; really, nothing else should be a PNG. _Everything_ else in the data4.x dir will be DDS files, compressed with whichever compressor produces the best DDS file for a given texture.


There will be a master branch made for the source versions of all the textures in the data4.x dir. These files should be XCF (GIMP) images or, if you must, Photoshop images, or at the very least a lossless format like PNG.

You'll notice VS loads like it's on crack now; it'll load even faster once we convert everything over to DDS.

For those worried that you won't be able to play in extreme detail anymore, don't worry. You can simply grab the master textures from the master texture branch and replace the ones in your data4.x/5.x dir. This means a double download for you, but that shouldn't be as bad as you'd think; DDS files compress very well with bzip2/gzip/etc., to the point where one is barely bigger than a JPEG.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

As of r11382, the data4.x dir has 660 JPEG files, 861 PNG files, and 24 bitmaps.

That means well over 1,000 files will have to be converted to DDS prior to release. All of them have to be looked at and checked to make sure they are in the correct DDS format and look good.

Lots of work ahead.



On top of that, there are so many files with extensions that have nothing to do with the image type they actually are that I'm starting to consider a changeover to a common extension, just so people don't get the bright idea to use something that relies on the extension to identify the file type.

nvcompress is guilty of that.


Pretty soon we'll be able to get rid of splash screens; VS will load so fast you'll barely get to see them.

Already it only manages to show one before loading finishes, and that's with the 660 JPEGs it has to decompress and recompress, plus the bitmaps and the 800+ PNGs it has to decompress. It's only going to get faster as more of those are turned into DDS files.
ace123
Lead Network Developer
Posts: 2560
Joined: Sun Jan 12, 2003 9:13 am
Location: Palo Alto CA

Post by ace123 »

Wow, sounds like you're really making progress! I've noticed the fast loading times lately.

Anyway, in the latest update I'm seeing that the basecomputer texture shows up corrupted. I also see a white atmosphere. Any ideas?

I'm using an NVidia 8600GTS with 256MB VRAM.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

Fixing it when I get off work. I was too enthusiastic and was only looking for visually transparent textures (I've seen animation textures that had an alpha channel but didn't need it). From now on, anything with an alpha channel, as far as /usr/bin/file tells me, will get DXT5; later on, if it's found that a texture doesn't need alpha, we can make it DXT1.

Sorry for the inconvenience. If you absolutely can't wait until tonight for the fixes, just merge backwards to r11380 in the /texture directory; you can update back to the latest version tonight.
chuck_starchaser
Elite
Posts: 8014
Joined: Fri Sep 05, 2003 4:03 am
Location: Montreal

Post by chuck_starchaser »

You only need DXT5 if the alpha value varies continuously. In most cases, I believe, alpha is used for full transparency or none, and DXT1 with its 1-bit alpha can handle that.
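The size difference between the two formats follows directly from their block layouts: DXT1 packs each 4x4 block of texels into 8 bytes (two 16-bit endpoint colors plus 2-bit interpolation indices, with the 1-bit alpha signaled by endpoint ordering), while DXT5 appends a separate 8-byte alpha block with its own two endpoints. A back-of-envelope calculator (illustrative Python, not engine code):

```python
def dxt_size(width, height, fmt):
    """Compressed size in bytes of one mip level.
    DXT1: 8 bytes per 4x4 texel block; DXT5: 16 bytes per block
    (an extra 8-byte interpolated-alpha block)."""
    block_bytes = {"DXT1": 8, "DXT5": 16}[fmt]
    blocks_w = max(1, (width + 3) // 4)   # blocks are padded up to 4x4
    blocks_h = max(1, (height + 3) // 4)
    return blocks_w * blocks_h * block_bytes
```

For a 1024x1024 texture that works out to 512 KB in DXT1 versus 1 MB in DXT5, against 4 MB uncompressed RGBA8 — so sticking with DXT1 where 1-bit alpha suffices halves the footprint again.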

BTW, I've read reports of people using DXT5 for compressing normal maps to dds with acceptable quality. The problem is, normal maps use the red and green channel for du and dv, and both require more precision than the blue channel (which is almost redundant anyways, as it could be computed in the shader off of du and dv); but DXTn compression dedicates more bits to the green (and alpha in DXT5) channels than it does to the red and blue. So the trick is to use DXT5 but to first move du from the red channel to the alpha channel. And this frees up the red channel, which can then be used for height info. Dv is already in the green channel, so no problem there. Du's changed position only requires a swizzle in the shader. Due to blue channel imprecision, the normal vector needs to be renormalized, though; but renormalizing is a trivial single op for a GLSL shader.
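The swap trick can be sketched as follows. This is an illustrative Python mock-up of the offline swizzle and the shader-side reconstruction, under the layout proposed above (height stored in the PNG's alpha, du moved into alpha before DXT5 compression); the function names are made up, and the real reconstruction would of course live in a GLSL shader, not Python.

```python
import math

def swizzle_for_dxt5nm(rgba):
    """Pre-compression swizzle of one normal-map texel (values in 0..1):
    move du from red into alpha (DXT5's highest-precision channel),
    freeing red for the height info that was in alpha, and keeping dv
    in green. Blue is dropped; the shader recomputes it."""
    du, dv, _blue, height = rgba
    return (height, dv, 0.0, du)   # (height, dv, unused, du)

def decode_in_shader(rgba):
    """What the (hypothetical) shader does after sampling: swizzle du
    back out of alpha and rebuild z, renormalizing as Chuck describes."""
    height, dv, _, du = rgba
    x = du * 2.0 - 1.0             # unpack from 0..1 to -1..1
    y = dv * 2.0 - 1.0
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))  # recompute blue
    return (x, y, z), height
```

In GLSL the decode is a one-line swizzle plus a `sqrt`, which is why the compressed-normal-map path only costs the shader a few trivial ops.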

EDIT:
And I would emphasize that the decision to use DXT5 for normal maps is a very important one to make right now, when (long live!) Hellcat has come up with the first shaders for the VS engine, and before we come to depend on a collection of shaders that can't use DXT5-compressed normal maps. Otherwise we may end up with a doubling of the number of shaders: one set for uncompressed normal maps and another for compressed normal maps. Also, when the Ogre branch moves forward, it will need height info for parallax, so any normal maps people create without height info would have to be redone. Better, IMO, to establish a standard right now, namely that PNG normal maps *must* have height info in the alpha channel, and provide an offline tool that swaps the red and alpha channels of a PNG prior to feeding it to the DXT5 compressor.
DualJoe
ISO Party Member
Posts: 387
Joined: Tue Mar 21, 2006 2:37 pm
Location: Netherlands

Post by DualJoe »

Will DDS with mipmaps be supported? If so, would it be possible to link mipmap levels with the LODs?
You could then replace certain mipmaps in the DDS file with the correct texture for the LOD. This way you'd only need one set of DDS files for diff, spec, etcetera per model, instead of having separate sets of files for each LOD.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

All the DDS files have mipmaps thus far.

The mipmaps are used for scaling, not for containing multiple separate images. Each mipmap is half the size of the one before; that's how it's done.


Right now we bypass compression for PNG, so normal maps can be made PNG (nothing else) and we can use them without fear of bad DXTn compression. I have not tested normal maps compressed via swapped channels yet; maybe tonight.
gimp-dds has that option.
chuck_starchaser
Elite
Posts: 8014
Joined: Fri Sep 05, 2003 4:03 am
Location: Montreal

Post by chuck_starchaser »

@Dual: You can't tie mipmaps to LODs, because mipmaps are handled by the hardware's trilinear filtering. What we could use is separate textures for LODs, which I think the engine supports already, each with its own mipmapping.

@Safemode: One thing I found while working on a ship called the Demon, was that the normal map had better be twice the resolution of the diffuse and specular. Take a rivet, for example. A single texel is enough resolution to give it material properties; but its curvature needs at least four normal samplings. I think we could standardize to larger normal maps with compression, which look better than half the size without compression, anyways.

EDIT:
Here we go:

Image

The textures are only 512, except the normal map, which is 1024.
Each rivet is a single texel in diffuse and specular, but gets four normal map texels.

EDIT2:
I'm having doubts about normal map compression, on second thought...
Problem is, a rivet needs a symmetric spread of normals on two axes, but DXT compression works by picking two colors for each 4x4 block of 16 texels and interpolating between them, and two colors couldn't represent a full du/dv spread... unless DXT5 picks two extra interpolation endpoints for the alpha channel... I think it does... Never mind; it should work well.
DualJoe
ISO Party Member
Posts: 387
Joined: Tue Mar 21, 2006 2:37 pm
Location: Netherlands

Post by DualJoe »

I know how mipmaps are generated normally and that they are used for scaling. And by the way, what use do LODs have other than efficient scaling?

The trick I was talking about is not that uncommon AFAIK (at least the manual editing of mipmaps). The tools for it are readily available for download from either NVIDIA or ATI, and they allow manual editing of pre-generated mipmaps.

My question was whether it would be possible to link certain mipmap levels to the levels of LODs. Both are governed by distance and screen size, if I understand correctly. I don't know anything about the inner workings of graphics cards / OpenGL, so I have no idea whether this is possible or not. It also wouldn't make life any easier for the artists. It would, however, make it easier to maintain ships and LODs in units.csv and reduce the number of needed texture files.

I do know that using the texture of the high-res model on the LODs is not an option, both because the LODs (should) have different UV layouts and because simply scaling down high-res textures does not guarantee the same quality.

Are you sure about storing the original textures as XCF? The required storage space will be enormous. Blender has had nodes for a while now, which allow for pretty much all the layer and filter actions you'd need The GIMP for. Blender has some definite advantages over The GIMP, however: for one, much higher quality (32 bits per channel). It also allows the use of Blender's procedurals and certain 3D effects. Most important of all, however, is the reduction of needed storage space.
Take for example my Hornet: the original XCF files are 80 MB each for the different maps (diff, spec, etc.). After I converted the XCF files to .blend, I got much higher quality results with a file just below 3 MB. And that one single .blend file contains all the needed textures and the model plus LODs.

EDIT
Nevermind about the LODs, Chuck posted the answer while I was typing.
chuck_starchaser
Elite
Posts: 8014
Joined: Fri Sep 05, 2003 4:03 am
Location: Montreal

Post by chuck_starchaser »

DualJoe wrote: I know how mipmaps are generated normally and that they are used for scaling. And by the way, what use do LODs have other than efficient scaling?

The trick I was talking about is not that uncommon AFAIK (at least the manual editing of mipmaps). The tools for it are readily available for download from either NVIDIA or ATI, and they allow manual editing of pre-generated mipmaps.

Manual editing of mipmaps is an orthogonal issue.

DualJoe wrote: My question was whether it would be possible to link certain mipmap levels to the levels of LODs. Both are governed by distance and screen size, if I understand correctly.

Yes, they are both related to screen size, but LODs are a software thing, whereas mipmaps are a hardware thing; they are used by the GPU in ways you have no control over. In any case, if your LODs call for the same texture as the top LOD, the appropriate mipmaps for that distance will be used automatically.

DualJoe wrote: Are you sure about storing the original textures as XCF? The required storage space will be enormous. Blender has had nodes for a while now, which allow for pretty much all the layer and filter actions you'd need The GIMP for. Blender has some definite advantages over The GIMP, however: for one, much higher quality (32 bits per channel). It also allows the use of Blender's procedurals and certain 3D effects. Most important of all, however, is the reduction of needed storage space. Take for example my Hornet: the original XCF files are 80 MB each for the different maps (diff, spec, etc.). After I converted the XCF files to .blend, I got much higher quality results with a file just below 3 MB. And that one single .blend file contains all the needed textures and the model plus LODs.

I'm sold on Blender nodes already; it's just that not everybody will use Blender. I simply meant "image sources" as opposed to final textures.

DualJoe wrote: EDIT: Nevermind about the LODs, Chuck posted the answer while I was typing.
Lol, and I didn't see your edit :D
The server is a disaster these days for me; I keep getting blank pages... Is it just me?
DualJoe
ISO Party Member
Posts: 387
Joined: Tue Mar 21, 2006 2:37 pm
Location: Netherlands

Post by DualJoe »

chuck_starchaser wrote:The server is a disaster these days for me; keep getting blank pages... Is it just me?
I was just asking myself the same question, and then I saw this:
http://vegastrike.sourceforge.net/forum ... php?t=9306

We're not alone.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

As far as compressed files having non-standard mipmaps (ones that aren't each half the size of the level before) go, I think you can pretty much forget about that. We'd need a way to know when we're dealing with these special DDS files, and a way to tell how big the different mipmaps are if they're not half the size of the one before. The added complexity isn't nearly worth the effort of manually generating the mipmaps for a DDS file.


Better solutions exist.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

Blah. SourceForge's SVN is SOooooooo slow, and every now and then it just stops responding during large commits/updates. A big sprite dir update was supposed to have gone through while I was at work, but it appears that didn't happen.


More updates coming when I get home, then.

EDIT: Perhaps I spoke too soon. At least some of what I set up to commit is still going through. Guess it's just going very slowly.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

Apparently, the big problem with planets showing up white rather than correctly textured was due to an option in the config file. Or at least, it is worked around via the option; I'm not sure whether the option is invalid or whether the code isn't working correctly with it.


<var name="mipmapdetail" value="1"/>

Whenever that variable is set to anything higher than 1, DDS textures produce white nothingness. I don't know why yet, whether it should produce something or whether the option is irrelevant for DDS files; someone will have to enlighten me.


In any case, data5.x-test has a config file with the various quality levels setting that variable to 1. This may or may not be appropriate.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

Since the mipmap issue has been discovered, let me fill in anyone who hasn't been following the various threads and such.

nvcompress, the pride of NVIDIA for compressing high-quality DDS files, doesn't handle non-square textures correctly: once one dimension reaches 1, nvcompress stops, leaving the other dimension at whatever size it was.

The problem is, GL needs mipmaps all the way down to 1x1, or it flips out. For non-square textures this means that once one dimension reaches 1 and the other hasn't, the dimension at 1 must stay at 1 while the other keeps halving until it reaches 1 as well.
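The chain GL expects can be sketched in a few lines (illustrative Python, not the nvcompress patch itself): halve both dimensions each level, but clamp any dimension that has already reached 1 and keep going until both are 1.

```python
def mip_chain(width, height):
    """Full mipmap chain down to 1x1, the way GL expects it: each level
    halves both dimensions, and a dimension that has already reached 1
    is clamped there while the other keeps halving."""
    levels = [(width, height)]
    while (width, height) != (1, 1):
        width = max(1, width // 2)   # the max(1, ...) clamp is the part
        height = max(1, height // 2) # the buggy nvcompress was missing
        levels.append((width, height))
    return levels
```

For an 8x2 texture this yields (8,2), (4,1), (2,1), (1,1); the unpatched nvcompress would stop at (4,1), leaving the chain incomplete.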

I patched nvcompress to do this and regenerated every file I found that was non-square.

To handle any textures I missed, or any created in the future with the unpatched nvcompress, I have a workaround that displays a warning and creates fake mipmaps for the missing levels so that the game will still work. It probably won't look great, though, until you post the offending texture name to the forum so a correct file can be generated.

So the whole problem with crashes and white textures with DDS files has been solved, and traced directly to missing mipmaps caused by a buggy nvcompress.

I'll be posting the patch to nvcompress tonight, as well as merging my local branch changes to head.

In addition to fixing the mipmap issue, the max-dimension issue is also being addressed. Some users with really crappy video cards can't display all our textures at full resolution, or perhaps we'll link the max dimension to a quality level in the future. In any case, when the max dimension is less than the dimensions of a texture, we need to choose a correct, smaller mipmap to upload instead. This was broken before, but I fixed it in my branch, and it'll be merged with the other changes tonight.
What you'll notice is blurriness on textures that are scaled a lot. If you have the ability, set anisotropic filtering to a level higher than 0; that may help (it's supposed to reduce texture blurring anyway).
It may be advantageous to allow some control over the max dimension depending on the quality setting; setting it to 256 saved me over 100 MB of RAM (from 700-ish down to below 600 MB). This setting only saves memory, not CPU, however.
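The mip-selection fix described above amounts to skipping levels until the texture fits. A sketch of the idea (illustrative Python; the function name and return shape are made up, not the engine's API):

```python
def base_level_for(width, height, max_dim):
    """Pick the first mip level whose larger dimension fits within the
    card's (or quality setting's) max texture dimension, and upload the
    chain starting from that level instead of level 0."""
    level = 0
    while max(width, height) > max_dim:
        width = max(1, width // 2)
        height = max(1, height // 2)
        level += 1
    return level, (width, height)
```

A 1024x512 texture with a 256 max dimension would start uploading at level 2 (256x128), which is exactly where the memory savings mentioned above come from: the two largest levels, which dominate the total size, are never uploaded.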



With these two issues squashed, I think it's really about time we made a final list of the showstoppers we need to address, so we can get this release out.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

nvcompress patch is at http://signal-lost.homeip.net/files/nvi ... ture.patch

Jump into the nvidia texture tools dir, apply the patch, and recompile.


Lots of little fixes to the mipmap code tonight, with fixes for buffer overruns being the most important. Update to at least r11602 before posting any bug reports.