The Intel performance situation revisited

Find any bugs in Vega Strike? See if someone has already found them, or report them here!
Miramor
ISO Party Member
Posts: 410
Joined: Tue Jun 26, 2007 7:15 pm

The Intel performance situation revisited

Post by Miramor »

Okay, here is the lowdown as of SVN 11940:

- With 945GM chipsets on Linux, good performance around planets and decent performance around most bases and ships can be attained by switching to Software Retro mode, 640x480 resolution, 16-bit color, fullscreen. Set shaders to whatever you want; that won't affect performance in any way.

- However, even with these minimal settings, jump points, Confed fighter barracks, and certain ships (e.g. Yeoman, Ox) will bring you down to 5 FPS. In order for the game to be playable, these must be avoided or kept off the viewscreen. Fights around jump points can be particularly ugly, due to low framerates making it difficult to maneuver.

- Notably, some capships do not affect performance as much as others. Gleaners do not reduce the framerate at all; Yeoman class ships, by contrast, will reduce the game to a crawl. It seems as though the effect of a given ship on the game's performance is related to how many sharp angles are visible on it, or perhaps the number of intersections between curved and straight lines.

- Finally: when approaching performance-reducing objects such as wormholes, the framerate does not drop smoothly. Instead it drops gradually and then abruptly rises again; this repeats until it settles at < 5 FPS. This makes me wonder if the performance problems may be related to the handling of mipmaps.

Hope this helps...
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

Mipmaps won't change unless your distance changes, and if this issue only happens when you're relatively close, you'd be able to see the mipmap changing: it'll go from clear to blurry, with the blurry state being the lower-framerate one.

If the 945GM doesn't support DDS, you'll have a huge problem playing VS. You'd have had a huge problem playing VS before DDS too, as you'd have had no way of compressing the many hundreds of MBs of textures that may be used in a single system. That means a LOT of bouncing from system RAM to video RAM, and possibly back to the HDD.

That aside, we definitely have an issue when drawing planets, and basically any other sufficiently large unit, when it is directly in front of your view and close enough to fill your view up. I don't know if it's a detail map problem or what, but even on my crappy Nvidia 6600 it throttles the GPU. I'd be interested to know if this happens the same for your card in non-retro mode... you'd see a sudden drop in framerate when panning around just after launching from Atlantis, when Atlantis is center view. Just off center and it'll bounce back to full framerate - no in-between, really.

Really, what the performance comes down to is video RAM size. If everything fits in video RAM you can get 150 FPS with low-end hardware (I consider the 256MB PCIe x16 Nvidia 6600 low end because it costs 10 bucks to buy new, and I'm not exaggerating). When you start to use more video memory than you have on the card, the framerate drops, a lot. In VS we're definitely memory bound rather than CPU or GPU bound. If I overclock my GPU, I get no change. Overclock my video memory, and I get significant improvements.

So I just suggest looking into how much memory you have on the card, trying out what I asked about above with Atlantis, trying 640x480 in other quality modes (you have to manually set that in the config file; stupid vssetup doesn't have the dropdown), and reporting your findings.
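
For reference, setting 640x480 by hand means editing the resolution vars in vegastrike.config; I believe the entries look something like this (check your local copy for the exact section):

Code: Select all

<var name="x_resolution" value="640"/>
<var name="y_resolution" value="480"/>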

There is definitely something weird going on with displaying significant units. Hopefully it can be shown not to be video card specific... and that will help us track it down.
Ed Sweetman endorses this message.
Miramor
ISO Party Member
Posts: 410
Joined: Tue Jun 26, 2007 7:15 pm

Post by Miramor »

Thanks, I'll try that.

For the record, in this thread:

http://vegastrike.sourceforge.net/forum ... php?t=9905

it's mentioned that part of the problem is that the option for turning off line smoothing in vegastrike.config... doesn't.
Breakable wrote: I took a look in the source code and it seems the smooth_points and smooth_lines variables from the config file are not very much respected. They seem to be used for 3D drawing settings - only in this function:

Code: Select all

void GFXVertexList::Draw(enum POLYTYPE *mode, const INDEX index, const int numlists, const int *offsets)

But not for 2D drawing, like shield lines.

Well, it might be that the warp point wireframes aren't affected after all, or I just need to understand more about the Vega Strike source code, but somehow I am really happy with my current performance.
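
For reference, the entries in question in vegastrike.config look something like this (a sketch; the exact section they live in may differ):

Code: Select all

<var name="smooth_lines" value="false"/>
<var name="smooth_points" value="false"/>
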
Hopefully Breakable's patch still works, as I'd like to try it at some point.

Re DDS, the situation with that is pretty strange. The hardware is supposed to support it, and the config options for the driver say as much, but glxinfo (and Vega Strike) don't see any DDS support. It looks to me like the XOrg developers may have avoided supporting it out of fear of patent lawsuits.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

That option does turn it off for some things; the problem is there is a lot of code (cough, animation... cough, thus wormholes) where we don't look for the config variable, we just set the settings.

You'd notice it if you had the old shield textures that were just lines. Turn the config setting off in the config file... now it's jagged. Turn it back on, now it's smooth. Which seems to contradict what Breakable said. I don't know; in my retro mode fixes, that was something I noticed when I enabled/disabled it while looking for more "ugly" displays. It's really hard to get VS to look bad enough that you think you're playing a game from the mid-'90s.


http://dri.freedesktop.org/wiki/S3TC

There's lots of fun out there regarding the patent license issues of S3TC. Nvidia has a license covering all their software (the free ones too) and drivers and such. Interestingly enough, some countries may essentially not allow the use of sdds.cpp/.h unless the GIMP plugin has a license to use S3TC compression.
Ed Sweetman endorses this message.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

AFAIK, the S3 patent only requires registration and the payment of royalties for graphics driver development, not for applications and encoders, and (again AFAIK) S3 has issued a general license to use S3TC/DXTC for all OpenGL drivers. The HUGE advantage of

http://wiki.secondlife.com/wiki/Talk:Texture_cache

Good enough for me.
Ed Sweetman endorses this message.
Miramor
ISO Party Member
Posts: 410
Joined: Tue Jun 26, 2007 7:15 pm

Post by Miramor »

Update: tried it at 640x480 with full detail + smoothing. As expected, performance is slightly reduced in open space. With ships and planets, things aren't much worse - until more than two objects (ships, planets, etc.) are in view at the same time. At that point, framerate goes south. If one of the objects is a planet, then following such an event, performance will drop whenever the planet is in view, even if the other objects are not in view at the same time. So, if you're fighting two Hyenas in orbit, and both cross your screen while you're facing the planet, you'll get performance problems any subsequent time you turn towards the planet, even if both Hyenas are behind you.

These results might not apply to jump points and the capships I mentioned previously, but I didn't try those; I don't feel like crashing my computer today.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

The link I posted was to the library you have to install, with instructions on how to set Mesa up so that the open source DRI drivers will use S3TC compression. By default, I don't believe Mesa uses hardware S3TC.

That alone may help things for you. Just out of curiosity, could you run VS in windowed mode and mention how much RAM Vega Strike is using, via top or some such utility? The resident RAM usage is what I'm concerned with.

Compression not working == 1.2GB in use.
Compression working == less than 600MB.
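
If top is a pain, something like this gives the resident set size directly (the process name "vegastrike" is an assumption; adjust it to match your binary):

Code: Select all

# resident set size (RSS, in KB) of the running vegastrike process
ps -C vegastrike -o rss=,comm=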


I would think that we'd have to address this issue before we start researching problems with the drawing of units/planets, because those problems could be explained by the swapping in and out of textures when there's not enough video RAM.
Ed Sweetman endorses this message.
Xit
Bounty Hunter
Posts: 186
Joined: Mon Aug 06, 2007 2:34 am
Location: Cambs

Post by Xit »

safemode wrote: Because those problems could be explained by the swapping in and out of textures when there's not enough video RAM.

Does that take into account the fact that the planet continues to cause slowdown even after the other unit has gone?

I was trying to think of how we could test whether this relates to the planet/ship filling-the-screen bug, but I think we'd need a special system that contains only 1 planet, or 1 planet and 1 ship - has this been done, and is there an easy way to do it?
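
I imagine a stripped-down .system file would do it. This is a rough sketch from memory, so treat every element and attribute name here as an assumption and compare against a stock system file in the data directory:

Code: Select all

<!-- hypothetical minimal system; attribute names/values need checking -->
<system name="perftest">
	<planet name="Lonely" file="planets/ocean.png" radius="10000"
		x="0" y="0" z="100000" year="10000" day="10000"/>
</system>
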
Save The Economy
http://vegastrike.sourceforge.net/forum ... hp?t=10605

My boxes: Dual Opteron 280s, Geforce 7600, 2GB RAM, but waiting for a new PSU! grrr...
500 MHz Compaq laptop that gives DC electric burns
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

It's tricky. It could be memory related, from having too much memory in use (as we see a decrease in the perf hit when we decrease resolution or overclock the video memory), or it could be GPU bound, with the decreased resolution and overclocked memory respectively letting the GPU do less work and retrieve data faster.

It's hard to say... I haven't had much time to devote to doing a step-by-step backtrace through the Draw function during a slowdown. I just notice my GPU usage throttling and memory usage becoming very heavy all of a sudden, then dropping off all of a sudden when I stop viewing the planet.

Another clue: when I was playing with retro mode, changing the texture size of the planet had little effect. It wasn't until I lowered the resolution that I saw a change in behavior. The effect didn't go away, it was just less of a burden on the card at lower resolutions.
Ed Sweetman endorses this message.
Miramor
ISO Party Member
Posts: 410
Joined: Tue Jun 26, 2007 7:15 pm

Post by Miramor »

Xit: I get the feeling that the continued slowdown might have to do with keeping stuff cached in RAM - or swap space.

safemode: Thanks, I just installed the compression library. I'll see if that does anything.
spiffariffic
Trader
Posts: 22
Joined: Mon Feb 18, 2008 2:11 am
Location: Eastern WA, USA

Post by spiffariffic »

You think this may have something to do with the planet rings showing through?
Xit wrote: Does that take into account the fact that the planet continues to cause slowdown even after the other unit has gone?
Well, the textures may still be in graphics memory, forcing things to be swapped around and slowing the frame rate.

Edit: What Miramor said.
I don't suffer from insanity, I enjoy every minute of it.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

Do a glxinfo and see if you grok S3TC; if you do, then you're golden as far as that's concerned.
Ed Sweetman endorses this message.
Miramor
ISO Party Member
Posts: 410
Joined: Tue Jun 26, 2007 7:15 pm

Post by Miramor »

Code: Select all

$ glxinfo | grep -i s3tc
    GL_EXT_texture, GL_EXT_texture3D, GL_EXT_texture_compression_s3tc, 
    GL_SUN_multi_draw_arrays, GL_S3_s3tc
Unfortunately, there's not much improvement (at least on Software Retro settings). VS loads and exits a little faster (and graphics look ever so slightly less clean), but jump points still bring the game to its knees.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

You wouldn't happen to have vsync enabled, would you? Perhaps you have to set a config/environment variable to disable vsync... vsync will slow everything down, from loading the game to playing the game. I really don't know why it slows loading the game, but it definitely does.
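
With Mesa-based DRI drivers (which the 945GM uses), the usual knob is the vblank_mode environment variable; setting it to 0 should force vsync off for that one process, assuming your driver honors it:

Code: Select all

# "vegastrike" here is whatever your launch command is
vblank_mode=0 vegastrike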

By jump points, do you mean the wireframe, or the open wormhole?
Ed Sweetman endorses this message.
Miramor
ISO Party Member
Posts: 410
Joined: Tue Jun 26, 2007 7:15 pm

Post by Miramor »

Vsync? I'm afraid I don't know what you're talking about... Is this a VS setting or an X setting?

As for jump points, yeah, I mean the wireframe. Once the individual lines become visible, the framerate is cut in half.

(I can forget about the wormhole animation... It doesn't slow anything down and looks fine coming out, but going in it's just a single static image. Not that I particularly care.)
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

Without vsync and with hardware S3TC, you should see crazy fast loading - well under 20 seconds from selecting a save game to finishing.


That library I directed you to may just be software routines implemented in Mesa's GL libs as a fallback for missing hardware support. You might have to do something special to get hardware S3TC support for your driver. Check your driver documentation specifically about it.

Also, bring up the RAM usage for VS and post that. If it's showing up as taking more than 600MB of RAM, then it's just using the software fallback to transparently handle your hardware's non-support. It may manifest in Xorg's RAM usage, so check it before and after starting VS.
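
E.g., run this before and after starting VS and compare:

Code: Select all

# Xorg's resident memory, in KB
ps -C Xorg -o rss=,comm=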

I don't think this will address the performance when viewing a planet or such (I think that's a separate problem), but it should definitely make the game more playable in other situations.
Ed Sweetman endorses this message.
Miramor
ISO Party Member
Posts: 410
Joined: Tue Jun 26, 2007 7:15 pm

Post by Miramor »

Umm... again... How do I disable vsync? I have absolutely no idea what it is.

Re the library being a software fallback, I'll see what I can find...
Miramor
ISO Party Member
Posts: 410
Joined: Tue Jun 26, 2007 7:15 pm

Post by Miramor »

Alright... I just started VS in a window, got a bounty mission, chased a Nicander through a jump point, got my ass kicked thanks to a Clydesdale slowing the game down, and landed on a planet for repairs. Throughout that, Vega Strike's resident memory never exceeded 305 MB. Looks like the library you linked to works.

As for vsync... Okay, this is forcing the graphics card to sync to the monitor's vertical refresh rate, isn't it? Do you have any idea how to disable that? Because I'm searching and I'm not finding anything, only crap about Compiz.

Edit: I see an xorg.conf option for fglrx. I'm using Intel crap. Does the 945GM even have such a feature?
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

Does the framerate ever go above 75? If so, then vsync is likely off.

PS: Do you not have the 1GB RAM option selected in vssetup? Getting to the main menu uses almost 300MB of RAM for me. I don't think that even with the 256MB option set I used less than 500MB getting into a system.
Ed Sweetman endorses this message.
Miramor
ISO Party Member
Posts: 410
Joined: Tue Jun 26, 2007 7:15 pm

Post by Miramor »

No, framerate never goes above 50.

(My monitor's refresh rate is 60 Hz, but it's an LCD - this is a laptop. Come to think of it... that's for the whole monitor, isn't it? What would be the point of having vsync enabled with an LCD - wouldn't tearing be impossible?)

Re the RAM, I use 512 MB + 1 GB swap (or 512 MB + 256 MB swap; it doesn't seem to make a difference). Also, keep in mind that I was running in Software Retro mode, so max_texture_dimension and max_cubemap_size are very small.
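
(For reference, those entries in vegastrike.config look something like this - the values here are just illustrative, not what I actually have set:)

Code: Select all

<var name="max_texture_dimension" value="256"/>
<var name="max_cubemap_size" value="1"/>
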
Miramor
ISO Party Member
Posts: 410
Joined: Tue Jun 26, 2007 7:15 pm

Post by Miramor »

Okay, I applied Breakable's patch and recompiled.

Performance around planets, jump points, and other large objects is so much better it's not even funny - there is no framerate drop whatsoever. The wormhole animation is smoother as well. Additionally, lines that were blurred (as if by antialiasing) before, such as those in jump points, now look clean - if anything the game looks better.

So yes, it's pretty clear that the chief problem was VS failing to obey smooth_lines/smooth_points="false" in a lot of cases. This patch, or something similar, should probably be applied to the SVN, so that minimal settings are actually minimal:

link to patch
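
Applying it against an SVN checkout goes something like this (the filename and -p level are assumptions; they depend on how the diff was generated):

Code: Select all

cd vegastrike
# dry-run first to check it applies cleanly; use -p1 if paths have a leading directory
patch -p0 --dry-run < smooth_lines.patch
patch -p0 < smooth_lines.patch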

Vega Strike still slows down significantly when large numbers (i.e. a dozen or more) of ships are in view, regardless of range, but that looks like a different issue - the main performance problem seems to have been solved.
Breakable
Hunter
Posts: 95
Joined: Sun Nov 18, 2007 1:19 pm

Post by Breakable »

I thought the patch was in svn already.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

Ace said he committed it. I guess I missed it, as I haven't rolled it into my branch yet. Will do that today.

I can't view the patch since I'm at work. If nobody beats me to it, I'll look at it when I get home tonight and commit it. I'm guessing I already know what I'll see. We make all kinds of assumptions throughout gfx/ that ignore config variables and even ignore settings from our own APIs - like ignoring the mipmap settings in ani_texture.



BTW, vsync slows down the loading of textures, not just the rendering on screen. I'm not sure what aspect of our texture loading process causes this. When I have vsync enabled, loading VS takes 40+ seconds; with vsync disabled, I get VS loaded in less than 20 seconds from command line to launching from Atlantis (loading a saved game).
Ed Sweetman endorses this message.
Miramor
ISO Party Member
Posts: 410
Joined: Tue Jun 26, 2007 7:15 pm

Post by Miramor »

Okay, I don't *think* vsync is enabled.

Code: Select all

cat /var/log/Xorg.0.log | grep -i vsync
(II) I810(0): Modeline "1280x800"x0.0   71.00  1280 1328 1360 1440  800 803 809 823 -hsync -vsync (49.3 kHz)
Unless that's a different vsync.

Re the patch... Yeah, it was supposedly committed, but it's pretty clear that smooth_lines="false" doesn't work right without it in the current SVN... Maybe the current situation should be considered a regression.
safemode
Developer
Posts: 2150
Joined: Mon Apr 23, 2007 1:17 am
Location: Pennsylvania

Post by safemode »

It works, it's just not used everywhere we call the function it's supposed to control. Over time it seems like people forgot we had config vars for certain things; no idea how that could happen *cough* no central defaults list built into vs_config *cough*.
Ed Sweetman endorses this message.