The Way Forward

Development directions, tasks, and features being actively implemented or pursued by the development team.

Postby pheonixstorm » Thu Jan 28, 2010 12:34 am

OK, so I'm posting waaaay too much, but at least most of it makes sense. Anyway, here's the next minor, easy-to-ignore rant.

Pick a job so we can get 0.5.1 on track!

I think jason and I (is that right?) are working on getting the Windows binary to work (though each in different ways), so that much is taken care of.

chuck is still working on his shaders... not sure what safemode is working on, and turbo has kept blasting away at new audio (but can you code?). Once you are done with the shaders, can you go back and check all the current models to make sure they are up to snuff? You don't have to rework them unless they need it, if you would rather get the shaders and cubemaps up to par.

Here are the highest priorities not covered.

1. FIXING THE SVN - This has got to be a high priority. Who has admin rights that can help us start the SVN from scratch, more or less? Since 0.5.0 is fairly stable, should we get rid of everything before this point?

2. Flesh out the current VS universe. I don't know how much back story is done, but I am still trying to take it all in. Are all the races/factions done? What about all the different ship types (not the models, just the basics)? How many systems will the game eventually have? Are they all created, and the planets fleshed out?

3. Stop adding new code and fix the current dev branch. Who is doing the coding? I think it's time to get 0.5.1 into a more stable position so we can stop and rethink everything.

4. Review the whole project and see where we should go from here. I know 0.6 was in the works but work stopped for whatever reason. Once a stable 0.5.1 is out, we all need to take a long look and see where things should go: file formats, engine tweaks/rewrites, etc.

Just a few of my own ideas... anyone have anything to add, debate, or shake a stick at?
Because of YOU Arbiter, MY kids? can't get enough gas. OR NIPPLE! How does that make you feeeel? ~ Halo
pheonixstorm
Elite
Posts: 1567
Topics: 113
Joined: Mon Jan 25, 2010 7:03 pm

Re: The Way Forward

Postby safemode » Thu Jan 28, 2010 8:34 am

pheonixstorm wrote:1. FIXING THE SVN - This has got to be a high priority. Who has admin rights that can help us start the SVN from scratch, more or less? Since 0.5.0 is fairly stable, should we get rid of everything before this point?


Not going to happen. If SVN is confusing, it's because you're new to it. The whole point of SVN is to keep track of all the revisions to all the files, from the start to now. There are about 10 years or so of such changes in VS. Losing the history of the files is not acceptable.

Also, one of the reasons some directories don't get deleted from HEAD after they are no longer used is that in SVN, deleting doesn't save any space on the server; it just removes the item from the current version onward when you do a pull. You don't have to pull that stuff when you grab the code that is being used for development. I.e., nobody should be pulling the entire trunk. The wiki and the how-tos define the currently active repos and their purpose. Even if /trunk were populated by a million directories, this should not bother or matter to anyone using SVN, as the ones they need are listed on the website.

The other reason such directories aren't deleted after they are no longer of use is that someone may take control of a particular directory and make use of it later (if possible), and deleting it from HEAD means the only way they'd know that code even existed is if they knew about it before it was deleted.

Basically, it being there is a zero-cost issue. Deleting it has some significant negatives, and only a superficial positive.


2. Flesh out the current VS universe. I don't know how much back story is done, but I am still trying to take it all in. Are all the races/factions done? What about all the different ship types (not the models, just the basics)? How many systems will the game eventually have? Are they all created, and the planets fleshed out?

Check out the authoring forum, if it's visible to you.

3. Stop adding new code and fix the current dev branch. Who is doing the coding? I think it's time to get 0.5.1 into a more stable position so we can stop and rethink everything.


We're way past 0.5.1; the next release will likely be 0.6.0. The main reason interim releases weren't made is that the only people who made win32 releases are forever busy, and /data is too huge to roll up in its entirety every month. We don't yet have an incremental patch system in place. Get this working and you fix your release schedule, and we stop having all these issues related to people playing in SVN who shouldn't be, as well as feature creep.

4. Review the whole project and see where we should go from here. I know 0.6 was in the works but work stopped for whatever reason. Once a stable 0.5.1 is out, we all need to take a long look and see where things should go: file formats, engine tweaks/rewrites, etc.


The reason why 0.5.1 won't work is that HEAD is currently completely incompatible with 0.5.0: user directories should be wiped when moving to the current HEAD, and /data needs to be completely replaced. Keeping the same major revision would be misleading. A whole lot of stuff has been done since 0.5.0, and while we don't match the roadmap outlined for 0.6.0, the cubemap and shader work should mark the start of a new major revision.



I think you'll find that organizing and planning help a little, and it's been done recently and should continue to be done, but there are only a handful (literally, I think, probably fewer than 5) of people who actively work on VS to any significant degree. Unfortunately, most are no longer in school, so time constraints cause most of these plans to mutate and deteriorate almost instantly, and there isn't anyone to take up the slack or help out.

Any future plans specific to releases, once the next one is made, should really focus on one or two things being a _requirement_ of that particular release; any other changes would be allowed if they didn't hurt anything. That second part is important, because you can't force the few people who are working on VS to follow a strict list of what is allowed to be worked on. We take what we can get when we can get it, because we really have no choice. We need more college kids without jobs and a life :)

I think the single most important thing to get hammered out and actually implemented, for the next release and for VS to move forward at any significant pace, is an incremental release system that works the same way on all platforms and exists as a single solution across all of them. Without that, you'll just have the same problem every release that VS has had for the last 5 years.
Ed Sweetman endorses this message.
safemode
Developer
Posts: 2150
Topics: 84
Joined: Sun Apr 22, 2007 6:17 pm
Location: Pennsylvania

Re: The Way Forward

Postby pheonixstorm » Thu Jan 28, 2010 1:07 pm

I see the reasons for what to do with SVN... I just can't stand the clutter under /tags, which is also where most of the stable branches seem to be hiding, something which neither the wiki nor the getfiles page points to; just the binaries and a Linux source tarball on SF. How many armchair coders (or modelers, for that matter) would know where to get the last stable build? I try to think like a noob so I can see what may or may not work. If someone is new to SVN, they may view the repository off the SF website, or use 'browse repo' on Windows using TortoiseSVN.

Here's another question though... are any of the project admins listed on SourceForge still active around here?

Re: The Way Forward

Postby safemode » Thu Jan 28, 2010 2:01 pm

pheonixstorm wrote:I see the reasons for what to do with SVN... I just can't stand the clutter under /tags, which is also where most of the stable branches seem to be hiding, something which neither the wiki nor the getfiles page points to; just the binaries and a Linux source tarball on SF. How many armchair coders (or modelers, for that matter) would know where to get the last stable build?

You aren't really supposed to blindly browse an SVN repo and expect to find stuff. Nothing is ever deleted in SVN, so there is just an information overload that makes this type of searching very inefficient.

A better solution would be to normalize release and revision tags apart from every other tag, so that we can say in the wiki that any release tag (anything we package up, be it a major release or a minor one) can be retrieved from SVN by naming convention, and we can be certain that such a tag exists, without having to manually add each tag to the wiki, etc.

But the /tags repo is the way it is because of the nature of what a tag is: it's simply a marker, not in any way different from a branch as far as SVN is concerned. Tags are meant to mark milestones, and while many of those milestones may currently seem pointless to keep around, SVN doesn't delete anything and isn't meant to be browsed. The existence or non-existence of tags is only meaningful to the people who know about them, and they can be safely ignored by those who don't.

In essence, you don't need to pull /tags, so don't worry about how much stuff is in it; only worry that the stuff you do want to pull from it is there. That's where a proper naming convention for releases helps. 0.5.0 is an exception to this naming convention; not sure why it wasn't properly tagged like every other release of VS was. That is a mistake.
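Such a convention could be sketched in a few lines of Python (the `release-X.Y.Z` tag name and the repository URL below are illustrative assumptions, not the project's actual scheme):

```python
import re

# Hypothetical convention: every release tag is named "release-X.Y.Z",
# so a wiki page (or a script) can construct the tag's URL from the
# version number alone, instead of listing every tag by hand.
RELEASE_TAG = re.compile(r"^release-(\d+)\.(\d+)\.(\d+)$")

def tag_url(version, repo="https://svn.example.org/vegastrike/tags"):
    """Build the tag URL for a release version, assuming the convention holds."""
    tag = "release-" + version
    if not RELEASE_TAG.match(tag):
        raise ValueError("not a release version: " + version)
    return repo + "/" + tag

def release_tags(all_tags):
    """Pick out only the release tags from a full /tags listing."""
    return sorted(t for t in all_tags if RELEASE_TAG.match(t))
```

With something like this, the wiki only has to state the convention once; any release tag's location then follows from its version number.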

I try to think like a noob so I can see what may or may not work. If someone is new to SVN, they may view the repository off the SF website, or use 'browse repo' on Windows using TortoiseSVN.


Noobs really shouldn't be messing with SVN, and if they do, they should be retrieving something specific. There's no need to explore around; not that there is any danger in it, it's just a waste of their time.

The real use for the website portal to SVN is remote access to the changesets. I often use its online diff between revisions while I'm at work, to troubleshoot code people have issues with while I'm away from my computer. It's a tool you use if you know what you're looking for; you're going to get nowhere fast with SVN if you don't have a destination, especially if you don't have a full grasp of how SVN deals with data. But once you have SVN under control, the web browsing interface is quite useful, though only for a few things.

Here's another question though... are any of the project admins listed on SourceForge still active around here?


Short answer: no. Long answer: they peek in and out every now and then. Right now they're fairly busy. I would think that once we were ready to roll out a release, that would get their attention for a bit, as that is an extremely tedious process that involves rolling up a distributable binary, updating documentation, the website, etc. A lot of activity. We've been very quiet this past year.

Edit: I forgot to mention, rolling up the release means doing it for Linux, Mac, and Windows. Though we usually let distros handle the Linux bins.

Re: The Way Forward

Postby charlieg » Thu Jan 28, 2010 2:32 pm

I think this is another broken methodology for releases.

Rolling a release should mean one thing only: the source release. The binaries will come. The release will attract people determined to build binaries for their platform. If you release 0.6.0, word spreads, and Windows users will come along and - many disappointed - some will pitch in. When Windows finally builds, then you either put out another announcement (0.6.0 now for Windows!) or, if there are updates (even relatively insignificant ones), roll out 0.6.1, which you can now simultaneously release for Windows.

Instead, we have Windows users coming here, getting lost, and going away again. They don't know what's going on whatsoever.
Free Gamer - free software games compendium and commentary!
FreeGameDev forum - open source game development community
charlieg
Elite Mercenary
Posts: 1328
Topics: 56
Joined: Thu Mar 27, 2003 4:51 pm
Location: Manchester, UK

Re: The Way Forward

Postby safemode » Thu Jan 28, 2010 3:08 pm

While I don't oppose splitting arch releases in that manner, not requiring patches to get a release to build on those archs is a requirement.

It's one thing to say we have 0.6.0, just nobody to build it on Windows, and to wait for someone to build it and then post the bins; it's another to say that Windows needs half a dozen patches to 0.6.0 to build. That makes the build the Windows bin is based on a different codebase than the other bins that may have been built prior to those Windows patches. That's a discrepancy that would just be unacceptable.

It's also important that the bins distributed on the website be built by trusted people. While the likelihood that someone would be malicious is low, it would be horribly damaging if we just accepted bins from wherever we could get them for a particular arch. It would be impossible to verify, so we have to rely on the honor system, based on history with that person as a developer/contributor. As such, one would have to wonder why such a member would not be able to make their bins by the release date the others are done on.

I know it sounds like I'm being difficult or contrary, but if it ever came down to "we're ready to release, but we have nobody to build for Mac or Windows," we would make a release. The real holdup to making a release is /data. Everyone agrees that rolling it up completely for every release is not acceptable. A solution, however, hasn't been on anyone's front burner till very recently.

Re: The Way Forward

Postby charlieg » Thu Jan 28, 2010 6:04 pm

My advice (/opinion) is that being too strict with versions, release requirements, etc. only protracts the development path and causes frustration. Release early, release often. Having stable/unstable releases (you should play X, but here is Y for the daring) is really healthy, as it 1) provides evidence of activity, 2) gets communities chatting, 3) gets non-developers more involved in the testing/dev process, and 4) fosters bigger communities because of 1+2+3. Releases don't need to be particularly stable or complete; they can omit certain platforms that are a WIP and include WIP material. All this is OK as long as you attach the caveat and invite people to help out; always have a stable 'this is for the casual player' version as a fallback.

Perfect is the enemy of good. There's nothing wrong with delaying things (again, less strict, more agile) to get things right; but identifying what 'right' is, that's what is critical to really breathing life into an open source project like Vega Strike. In my opinion (based on my blogging + bizarrely zealous interest in open source gaming), 'right' is getting the next release out of the door ASAP. Make an unstable release (0.5.9? 0.6.beta?) and don't worry too much about making things perfect. Hold off on 0.6.0 if you can't yet build for Windows, by all means, but don't not release.

Re: The Way Forward

Postby Turbo » Thu Jan 28, 2010 6:07 pm

pheonixstorm wrote:turbo has kept blasting away at new audio (but can you code?)

I have a BS in computer science so I can code, but that degree was awarded in 1991 and the languages I know aren't used here. Nor have I had to code anything more complex than XML and SQL since then. So even if I wanted to code, you'd be waiting a long time before I could start.

The project needs work in a dozen disciplines, some of which you mention in your other thread. Coding is only involved in a few of them, and even if it were a good idea to make everyone a coder, it won't happen.

Or did you ask the question as a way to offer your services in coding?
Turbo

There are two speeds in combat: stopped, and as fast as you can go. Unless you run into something, going fast keeps you alive more often than stopping.
Turbo
ISO Party Member
Posts: 423
Topics: 20
Joined: Mon Jun 11, 2007 4:54 am
Location: Korea (again)

Re: The Way Forward

Postby pheonixstorm » Fri Jan 29, 2010 1:14 am

charlie, releasing often would be nice, and for Linux it can be done now without any problems... the real issue is how much coding has been done since 0.5 was released without a new Windows binary. This could very easily cause the rest of the dev code to become crippled. Releasing often is good, but releasing for only one target platform while you don't even know what's broken on the others can lead to disaster.

turbo, I was curious whether you only did audio work or if you could code as well. For myself, I can code to a limited extent; enough to work on a few dev tools that will work for Privateer PU, and hopefully VS too. That may make working on the back end much easier for everyone. What languages did you learn? I was actually thinking that if your talents were like those of chuck_starchaser, then you could work on some of the audio code to include a few features people have been asking for. That was what I was hoping for: a lot of audio and a little coding.

I'm still trying to see who can do what around here. Seems safemode does the majority of the coding right now, with a little help from chuck; not sure about the talents of charlie and jason, other than being full of helpful opinions or trying to fix a busted win32 build. I will learn as I go anyway.

Re: The Way Forward

Postby charlieg » Fri Jan 29, 2010 7:27 am

Be specific, phoenixstorm: how can it lead to disaster? So the new code targets a single platform; are you going to just throw that code away? No. So release it, then. It might need modifying to bring it in line with other platforms, but since VS uses cross-platform technologies (OpenGL, etc.) that's usually nothing too serious. All your reasoning about not releasing is (IMO) baseless. (No offense meant :))

I agree with safemode in that an update tool (something already discussed) is a big part of the release process, but I stand by my advice that you should wrap up an unstable release immediately. This weekend. Why not? It's "unstable", so those who need to conserve bandwidth won't go there.

Btw, the sf.net project page lists 0.4.3 as stable. That might be a bit confusing.

http://sourceforge.net/projects/vegastrike

Re: The Way Forward

Postby safemode » Fri Jan 29, 2010 8:53 am

The only reason I say that the update process tool is a necessary requirement for the next release is that I'd like to avoid having to make _another_ full package of /data a month or so after the current release, once the tool is ready.

The next release is going to be different from all the ones that have come before. Vegastrike bins are going to be separate downloads from /data, installed separately, and they need some way of verifying each other's versions so that unscrupulous users don't mix and match the engine and the data.

That much is a given regardless of the updater.

If we can hammer out an API for versioning (a standardized way of identifying acceptable versions that are compatible with each other), then we could avoid having to repackage the entire /data when the updater tool is ready, and just have the updater tool be part of the bin package and start our incremental pkgs for /data. Like, 0.6.0.N, where any N is compatible with any other N, but if a 0.6.1.N release is made and the user downloads the bins for that, then they have to be running a /data that's at least 0.6.1. We then have to make sure we increment revisions of /data and vegastrike in a way that denotes which changes cause incompatibility and which ones don't. Either that, or we enforce a strict 1-to-1 compatibility, or some explicit list.
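A rough sketch of that compatibility rule in Python (the function names and the strict "same release triplet" interpretation are assumptions; the paragraph above also allows for an explicit compatibility list instead):

```python
def parse_version(v):
    """Split 'MAJOR.MINOR.PATCH.N' into ((MAJOR, MINOR, PATCH), N)."""
    parts = [int(x) for x in v.split(".")]
    if len(parts) != 4:
        raise ValueError("expected MAJOR.MINOR.PATCH.N, got " + v)
    return tuple(parts[:3]), parts[3]

def compatible(engine_version, data_version):
    """Engine and /data are compatible when their release triplets match.

    The trailing N (the incremental /data patch level) is free to differ:
    any 0.6.0.N /data works with any 0.6.0.N engine, but a 0.6.1 engine
    refuses a 0.6.0 /data."""
    engine_release, _ = parse_version(engine_version)
    data_release, _ = parse_version(data_version)
    return engine_release == data_release
```

The engine would run this check at startup and refuse to launch (or prompt the updater) on a mismatch, which is exactly the "no mixing and matching" guarantee described above.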

Anyway, if we can do that, we can avoid having to have the updater ready now, and still make use of the /data pkg we release so that the next release is an incremental /data pkg.

Re: The Way Forward

Postby pheonixstorm » Fri Jan 29, 2010 1:37 pm

charlieg wrote:Be specific, phoenixstorm: how can it lead to disaster? So the new code targets a single platform; are you going to just throw that code away? No. So release it, then. It might need modifying to bring it in line with other platforms, but since VS uses cross-platform technologies (OpenGL, etc.) that's usually nothing too serious. All your reasoning about not releasing is (IMO) baseless. (No offense meant :))


None taken.

OK, let me try to get this out better.

up to rev 1200 of game X works under win32/*nix/mac
the last mac/win32 release works
rev 1201 somehow breaks win32 and mac, but not *nix (it's a 25-line fix)
coding continues for *nix up to revision 1285 before anyone bothers to check a win32/mac build (Linux builds every 10-15 revs)
bug hunt for win32/mac
rev 1340 for Linux before you find the problem in win32/mac
the fix requires a major rewrite of a new graphics subsection of the code, totaling 686 lines of code

OpenGL, OpenAL, zlib, and other cross-platform libraries remain unchanged; they play no part in the broken base code.

Nothing in the game engine should be platform-specific; that's why we have the cross-platform libraries. This is why you want to release on all targets when possible, not just one. Which would you rather fix: 25 lines of code or several hundred?

The only way to get around this is to have a source branch for each target platform... but if you do this, why make it cross-platform to begin with?

Re: The Way Forward

Postby pheonixstorm » Fri Jan 29, 2010 2:48 pm

safemode wrote:The only reason I say that the update process tool is a necessary requirement for the next release is that I'd like to avoid having to make _another_ full package of /data a month or so after the current release, once the tool is ready.


Where will the incremental updates be?

The next release is going to be different from all the ones that have come before. Vegastrike bins are going to be separate downloads from /data, installed separately, and they need some way of verifying each other's versions so that unscrupulous users don't mix and match the engine and the data.


Do you have a new data structure in mind for separating different data builds, other than pulling matching files from SVN?

If we can hammer out an API for versioning (a standardized way of identifying acceptable versions that are compatible with each other), then we could avoid having to repackage the entire /data when the updater tool is ready, and just have the updater tool be part of the bin package and start our incremental pkgs for /data. Like, 0.6.0.N, where any N is compatible with any other N, but if a 0.6.1.N release is made and the user downloads the bins for that, then they have to be running a /data that's at least 0.6.1. We then have to make sure we increment revisions of /data and vegastrike in a way that denotes which changes cause incompatibility and which ones don't. Either that, or we enforce a strict 1-to-1 compatibility, or some explicit list.

Anyway, if we can do that, we can avoid having to have the updater ready now, and still make use of the /data pkg we release so that the next release is an incremental /data pkg.


Exactly which files/directories do you think will change enough between one minor revision and the next? We could always use something like the World of Warcraft updater: check for a new build when the launcher loads, download the patch files, then build them into the main data files.

Say we have one file per directory, ai.bin or some such. We download ai.patch, which contains an update to ai.bin bringing it to 0.6.1 build 16 or whatever. The updater runs, using zlib to open ai.patch and add the new files into ai.bin. Are you looking for something like this, or just updating separate files in the current directory structure?

To me it seems easier to make each folder, or each set of related files, into one separate archive complete with version info.
ai.bin contains
Code: Select all
ai.bin ->
-> version.txt (ver=061b1) 0.6.1 build 1
-> VegaEvents.csv
-> VegaPriorities.csv
-> /script
->-> heatseek.xai
->-> next file in the list....


On game start, zlib handles opening the archives so we can load all the files. How much overhead would this create, compressed or uncompressed?
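As a sketch of that idea, Python's zlib-backed zipfile module can treat an ai.bin archive as a zip file. The archive layout and the `ver=` line come from the example above; everything else, including using the zip format rather than something custom, is an assumption:

```python
import io
import zipfile

def read_archive_version(archive):
    """Open a compressed archive like ai.bin and return the version
    string stored in its version.txt (e.g. '061b1')."""
    with zipfile.ZipFile(archive) as arc:
        text = arc.read("version.txt").decode("ascii")
        # version.txt holds a line like "ver=061b1"
        for line in text.splitlines():
            if line.startswith("ver="):
                return line[len("ver="):].strip()
    raise ValueError("no ver= line in version.txt")

def load_all(archive):
    """Return {name: bytes} for every file in the archive, so the engine
    can load e.g. VegaEvents.csv without touching a loose directory tree."""
    with zipfile.ZipFile(archive) as arc:
        return {name: arc.read(name) for name in arc.namelist()}
```

Decompression overhead at load time is the trade-off being asked about; storing rarely-changing data uncompressed inside the archive is the usual middle ground.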

I get what you are looking for, safemode, but I'm not sure how you want to attempt it overall; is there a main thread this has been discussed in?

Re: The Way Forward

Postby Turbo » Fri Jan 29, 2010 8:04 pm

pheonixstorm wrote: I was actually thinking that ... you could work on some of the audio code to include a few features people have been asking for. That was what I was hoping for, a lot of audio and a little coding.

The only coding I'm doing is the XML that makes the game play the right audio when the AI pilots talk to the player. I like being on the content side, because the coders can change game engines or coding languages all they want and the content we made will still be useful.

Re: The Way Forward

Postby chuck_starchaser » Fri Jan 29, 2010 8:58 pm

Guys, Klauss has been working for over a year on a new audio system he's rewritten from the ground up; and last I heard he was almost done.
chuck_starchaser
Elite
Posts: 8014
Topics: 195
Joined: Thu Sep 04, 2003 9:03 pm
Location: Montreal

Re: The Way Forward

Postby pheonixstorm » Fri Jan 29, 2010 9:43 pm

That is really good news. Is it more efficient than the current audio code?

Re: The Way Forward

Postby chuck_starchaser » Fri Jan 29, 2010 10:11 pm

pheonixstorm wrote:That is really good news. Is it more efficient than the current audio code?
Nothing could be less efficient than the current code; but yeah, I've been led to believe
it will be the Atomic Mother of all audio codes.

Re: The Way Forward

Postby charlieg » Sat Jan 30, 2010 5:01 am

safemode wrote:The only reason I say that the update process tool is a necessary requirement for the next release is that I'd like to avoid having to make _another_ full package of /data a month or so after the current release, once the tool is ready.

The next release is going to be different from all the ones that have come before. Vegastrike bins are going to be separate downloads from /data, installed separately, and they need some way of verifying each other's versions so that unscrupulous users don't mix and match the engine and the data.

You are attaching needless conditions to a new release.

Why not do a full unstable release? Nothing fancy, just 'we have lots of changes since 0.5.0; here is 0.5.9.0, an unstable beta release of 0.6.0'. Why does it matter that it is a full release with /data and binaries together? Why does it need the updater? SourceForge can comfortably handle the bandwidth, and those who choose to download an unstable release obviously have no bandwidth issues. What's the problem with it and any subsequent releases being 'full'?

I agree that an updater for 0.6.0 is desirable and can hold up that release. Don't not release just because you want some feature X that has not been started yet. You are just pushing the next release further into the future than it needs to be.

Release early, release often and stop worrying about things like size. Size is not that big an issue unless it massively consumes developer time. If you mark releases unstable, those with modems won't be wasting their time/bandwidth.

Remember:
- a release generates traffic, which brings people to the project
- a release makes the project look more active, compelling skilled contributors
- a release makes existing contributors happy, their contributions become more accessible to players

An unstable release does not need an updater, and does not need to be complete or stable; it just needs to showcase new features, and there are plenty of those. If the current SVN is playable (playable != robust) and there are no glaring (and I mean really, really bad) problems with it, then it is releasable.


pheonixstorm wrote:OK, let me try to get this out better.

up to rev 1200 of game X works under win32/*nix/mac
the last mac/win32 release works
rev 1201 somehow breaks win32 and mac, but not *nix (it's a 25-line fix)
coding continues for *nix up to revision 1285 before anyone bothers to check a win32/mac build (Linux builds every 10-15 revs)
bug hunt for win32/mac
rev 1340 for Linux before you find the problem in win32/mac
the fix requires a major rewrite of a new graphics subsection of the code, totaling 686 lines of code

OpenGL, OpenAL, zlib, and other cross-platform libraries remain unchanged; they play no part in the broken base code.

This is so contrived it is hardly worth rebutting. I have never encountered it in 9 years of following hundreds of open source games. For you to use cross-platform libraries and still have written large amounts of non-cross-platform code would be pretty tough. How can you use a cross-platform graphics library (OpenGL) and then write a graphics subsection so un-cross-platform that it requires a 'major rewrite' to work on other platforms? That just does not make much sense. And even if it did, the point still stands: you would still gain from making a release even though it was only available to a subsection of your playing audience. The rest would just have to wait for the 'rewrite', and would be quite encouraged that there is activity, even if it isn't yet available on their platform of choice.
Free Gamer - free software games compendium and commentary!
FreeGameDev forum - open source game development community
charlieg

Re: The Way Forward

Postby safemode » Sun Jan 31, 2010 3:12 am

pheonixstorm wrote:
safemode wrote:The only reason I say that the update process tool is a necessary requirement for the next release is that i'd like to avoid having to make _another_ full package of /data like a month or so after what would be the current release once the tool is ready.


where will the incremental updates be?


Incremental updates will be in the files section on sf.net and its mirrors. This should not cause a problem; we would have one full copy of /data done at each major release and incremental patches done at each important revision.
The next release is going to be different from all the ones that have come before. Vegastrike bins are going to be separate downloads from /data, installed separately, and will need some way of verifying each other's versions so that unscrupulous users don't mix and match the engine and the data.


Do you have a new data structure in mind for separating different data builds? Other than pulling matching files from svn?


An incremental patch would likely consist of a bzip'd tarball (or whatever compression we can easily have python support out of the box) and a header file that explains to the updater what files to copy to where and what files to delete. That's all it _needs_ to do. If we want to add features later, we can.

A patch only increments from the preceding revision to the current one it was made for. To move from one version to a non-adjacent version, you'd need to pull all the increments and install them in order (the updater should handle that). Reversing an update should also be supported fairly early in the new updater; this can be done by backing up deleted files (if the user requests it), though that's not needed at first.
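The "pull all the increments and install them in order" step can be sketched briefly. This is a minimal illustration, assuming a hypothetical mapping of patch metadata; the function name and the version/file-naming scheme are illustrative only, not the project's actual format:

```python
# Hypothetical sketch: compute the ordered chain of incremental patches
# needed to move an installed /data from one version to a non-adjacent one.
# 'available' maps a 'prev' version to the patch that upgrades from it.

def patch_chain(installed, target, available):
    """Return the .update files to apply, in order; raise if a step is missing."""
    chain = []
    cur = installed
    while cur != target:
        step = available.get(cur)
        if step is None:
            raise RuntimeError("no patch upgrades from %s" % cur)
        chain.append(step["file"])
        cur = step["to"]
    return chain

available = {
    "0.6.0.1": {"to": "0.6.0.2", "file": "data_0.6.0.2.update"},
    "0.6.0.2": {"to": "0.6.0.3", "file": "data_0.6.0.3.update"},
    "0.6.0.3": {"to": "0.6.0.4", "file": "data_0.6.0.4.update"},
}

print(patch_chain("0.6.0.1", "0.6.0.4", available))
# ['data_0.6.0.2.update', 'data_0.6.0.3.update', 'data_0.6.0.4.update']
```

Because each patch only knows its immediate predecessor, a missing intermediate patch makes the chain unresolvable, which is why every increment needs to stay available on the mirrors.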

If we can hammer out an API for versioning (a standardized way of identifying acceptable versions that are compatible with each other), then we could avoid having to repackage the entire /data when the updater tool is ready, and just have the updater tool be part of the bin package and start our incremental pkgs for /data. For example, 0.6.0.N, where any N is compatible with any other N, but if a 0.6.1.N release is made and the user downloads the bins for it, then they have to be running a /data that's at least 0.6.1. We then have to make sure we increment revisions of /data and vegastrike in a way that denotes which changes cause incompatibility and which don't. Either that, or we enforce a strict 1-to-1 compatibility, or some explicit list.

Anyway, if we can do that, we can avoid having to have the updater ready now, and still make use of the /data pkg we release so that the next release is an incremental /data pkg.
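A minimal sketch of the "0.6.0.N compatible with any other N" rule described above, assuming simple dotted version strings; the function name is hypothetical:

```python
# Sketch of the proposed compatibility rule: engine and /data are
# compatible when their first three version components match; the
# fourth (the /data increment, N) is free to differ.

def compatible(engine_ver, data_ver):
    """True if the engine and /data agree on the first three version components."""
    return engine_ver.split(".")[:3] == data_ver.split(".")[:3]

print(compatible("0.6.0.3", "0.6.0.7"))   # True  - same 0.6.0 series
print(compatible("0.6.1.0", "0.6.0.7"))   # False - /data must be at least 0.6.1
```

An explicit compatibility list would replace the prefix comparison with a table lookup, but the check stays this cheap either way.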


Exactly what files/directories do you think will change enough between one minor revision and the next? We could always use something like the World of Warcraft updater: check for a new build when the launcher loads, download the patch files, then build them into the main data files.

Say we have files based on each directory, ai.bin or some such. We download ai.patch, which contains an update to ai.bin bringing it to .6.1 build 16 or whatever. The updater runs, using zlib to open ai.patch and add the new files into ai.bin. Are you looking for something like this, or just updating separate files in the current directory structure?
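A minimal sketch of that merge step, assuming the .bin files are ordinary zip archives (zlib-backed). Python's zipfile can't replace entries in place, so the merge rewrites the archive with the patch's entries taking precedence; all file names here are illustrative:

```python
# Sketch: build a downloaded patch archive into the main data archive,
# WoW-updater style. Entries present in the patch overwrite their
# counterparts in the main archive; everything else is carried over.
import os
import zipfile

def merge_patch(main_path, patch_path):
    merged = main_path + ".new"
    with zipfile.ZipFile(main_path) as main, \
         zipfile.ZipFile(patch_path) as patch, \
         zipfile.ZipFile(merged, "w", zipfile.ZIP_DEFLATED) as out:
        patched = set(patch.namelist())
        for name in main.namelist():
            if name not in patched:              # keep unchanged entries
                out.writestr(name, main.read(name))
        for name in patched:                     # add/overwrite from the patch
            out.writestr(name, patch.read(name))
    os.replace(merged, main_path)                # swap the rebuilt archive in
```

The rewrite costs one pass over the whole archive per patch, which is the main argument against very large per-directory .bin files.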

To me it seems easier to make each folder, or each set of related files, one separate archive complete with version info.
ai.bin contains
Code: Select all
ai.bin ->
    version.txt (ver=061b1)   <- 0.6.1 build 1
    VegaEvents.csv
    VegaPriorities.csv
    /script
        heatseek.xai
        next file in the list....


On game start, zlib handles opening the archives so we can load all the files. How much overhead would this create, compressed or uncompressed?
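To get a feel for that startup path, here is a sketch using Python's zipfile (zlib-backed), which reads entries straight from the archive without extracting to disk. The file names follow the listing above; the contents are made up for the demo:

```python
# Sketch: read game data directly out of a zlib-compressed per-directory
# archive at startup. Decompression happens per entry as it is read.
import io
import zipfile

# Build a tiny ai.bin for demonstration only; in the real game this
# archive would ship with /data.
with zipfile.ZipFile("ai.bin", "w", zipfile.ZIP_DEFLATED) as ai:
    ai.writestr("version.txt", "ver=061b1\n")
    ai.writestr("VegaEvents.csv", "event,priority\nattack,1\n")

# On game start: open the archive and pull entries as needed.
with zipfile.ZipFile("ai.bin") as ai:
    version = ai.read("version.txt").decode().strip()
    rows = io.TextIOWrapper(ai.open("VegaEvents.csv")).read().splitlines()

print(version)   # ver=061b1
```

The per-file decompression cost is generally small next to parsing the files themselves; the central directory is read once when the archive is opened, so lookups after that are cheap.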

I get what you are looking for, safemode, but I'm not sure how you want to attempt it overall. Is there a main thread where this has been discussed?


The patch would be a single file in a format best suited for use with python (as that is likely going to be a major part of the updater), likely compressed. The updater then just unarchives the patch, which is nothing more than a bunch of files plus a header file that tells it where to copy those files, which files to delete from /data, and what version to set it to.

So you'd have something like :
data_0.6.0.2.update
data_0.6.0.3.update
data_0.6.0.4.update

The .update files would just be tarballs, compressed if possible and directly accessible via python without any additional apps. So you'd have:
Code: Select all
data_0.6.0.2.update ->
    control
    textures -> some textures to change
    sound -> some audio updates

The control file would look like this:

Code: Select all
<xml>
    <delete>
        <file name="some file" />
        <file name="another" />
        <dir name="remove whole dir" />
    </delete>
    <move>
        <copy src="our patch path/file" dest="our data path/file" />
        ...
    </move>
    <rename>
        <ren old="some current file" new="new filename" />
    </rename>
    <version prev="0.6.0.1" cur="0.6.0.2" />
</xml>


Rename is something I would like to see in there because it would drastically minimize the impact of massively renaming files if need be, or of simply moving files around. This saves a lot of space when reorganizing but otherwise not changing anything.
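A sketch of how the updater might apply such a control file, using Python's standard xml.etree parser. The element and attribute names mirror the example; everything else (the function name, path handling) is an assumption for illustration:

```python
# Sketch: apply one .update's control file to /data. Deletes listed
# files/dirs, copies the patch payload in, performs renames, and
# returns the (prev, cur) version pair for the updater to verify.
import os
import shutil
import xml.etree.ElementTree as ET

def apply_control(control_xml, data_dir, patch_dir):
    root = ET.fromstring(control_xml)
    delete = root.find("delete")
    if delete is not None:
        for f in delete.iter("file"):
            os.remove(os.path.join(data_dir, f.get("name")))
        for d in delete.iter("dir"):
            shutil.rmtree(os.path.join(data_dir, d.get("name")))
    for c in root.iter("copy"):      # <move>: copy patch payload into /data
        shutil.copy(os.path.join(patch_dir, c.get("src")),
                    os.path.join(data_dir, c.get("dest")))
    for r in root.iter("ren"):       # <rename>: move files without shipping them
        os.rename(os.path.join(data_dir, r.get("old")),
                  os.path.join(data_dir, r.get("new")))
    v = root.find("version")
    return v.get("prev"), v.get("cur")
```

A real updater would also want to check the `prev` version against the installed /data before touching anything, and (per the reversibility idea above) back up deleted files before removing them.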
Ed Sweetman endorses this message.
safemode

