XCOMUFO & Xenocide

Video Cards A Year From Now...


Robo Dojo 58


Yes, I know it's been discussed before, but with Xenocide being planned for sometime next year, is it a good idea to limit ourselves to GeForce2-level hardware?

 

There's a chart of video cards on this page, showing the frame rates you get with older video cards. The GeForce2 is in the middle, and the GeForce2 MX is way at the bottom. Remember, this chart is a year old.

Here is a chart of the latest video cards. The GeForce2 isn't even on the list, and it's only been a year.

 

In a year, the GeForce2 will barely be found on eBay. By the time Xenocide is planned for release, a GeForce4 will be the baseline video card. Motherboards are already shipping with built-in Radeon 9600s for around $100. In a year, the cheapest cards available will be twice as powerful, maybe even more, and future boards will have at least a built-in GeForce3 as the standard.

 

The newest cards also have awesome shader effects (think iridium sheen) and excellent bump mapping. (Halo's tires are actually perfectly round, and the hubcap is flat.)

 

I've attached a pic showing what I'm talking about. This pic is from Halo, on the Xbox. The Xbox has a built-in GeForce3 chip.

 

So the bottom line is: do we really want to limit ourselves to the GeForce2 standard?

[attached image: jeepfire.jpg]


Exactly. It's one thing to have all these wonderful effects, but you shouldn't add lots of code just because you can; it slows things down or makes it not work on older gear. If the GF4 is the standard close to release, cool! Then we can make crash/terror sites that much bigger, and/or use the X-Net textures in the battlescape too, because the typical card will have 256MB of memory, etc.

In other words, we're limiting ourselves to the GF2 until we actually need better. I mean, MicroProse's X-COM 1 ran on low-end 16MB graphics cards and it was still cool. Why should we need much better?

I mean, MicroProse's X-COM 1 ran on low-end 16MB graphics cards and it was still cool.

Actually, it works with a good ol' Trident with 0.5MB of memory. A 16MB graphics card is overkill, especially when the whole game isn't even 16 megs. Since the game predates 3D acceleration, any card with enough memory to do VGA works fine; the original's 320×200 screen at 8 bits per pixel is only 64,000 bytes, well within half a meg. :wink:

 

Why should we need much better?

Well, I was thinking it'd be really useful mostly for DirectX 8/9 effects, like the iridium sheen, bump mapping on various surfaces, and especially the lighting effects (chroming, shadows, etc.). You wouldn't get many of those in DirectX 7, which is what the GeForce2 supports. It isn't so much about more triangles/larger textures/bigger fields as it is about having more visual effects to work with.
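To make that concrete, here's a rough C++ sketch (assuming a Direct3D 9 renderer, which is just my assumption, and a made-up function name) of how the engine could ask the card what shader level it supports. A DirectX 7 card like the GeForce2 reports no pixel shader version at all, so the shader-based effects would need a fixed-function fallback on it.

[code]
#include <d3d9.h>

// Rough sketch: query the installed card's pixel shader support.
// (Illustrative only; Xenocide's actual renderer may differ.)
bool QueryShaderSupport(bool& hasDX8Effects, bool& hasDX9Effects)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return false;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // A GeForce2 (DirectX 7 class) reports pixel shader version 0.0,
    // so effects like the iridium sheen would be skipped on it.
    hasDX8Effects = caps.PixelShaderVersion >= D3DPS_VERSION(1, 1);
    hasDX9Effects = caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);

    d3d->Release();
    return true;
}
[/code]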


Well, I was thinking it'd be really useful mostly for DirectX 8/9 effects, like the iridium sheen, bump mapping on various surfaces, and especially the lighting effects (chroming, shadows, etc.). You wouldn't get many of those in DirectX 7, which is what the GeForce2 supports. It isn't so much about more triangles/larger textures/bigger fields as it is about having more visual effects to work with.

I agree completely.

 

 

Anyway, if you really want to limit the game to a DirectX 7 card, you should choose a slightly better one like the GeForce2 GTS/Ultra or GeForce4 MX440. These cards have 64MB of video memory, which would allow using larger and sharper textures and really make a difference, at least when zooming in on the geoscape. I doubt there will be anyone who doesn't have at least 64MB on their gfx card in a year, and if there is, they can get one for hardly anything!

 

The GeForce2 MX cards are pathetic; even my old GeForce256 with 32MB DDR is better.


  • 1 month later...

DX7? MX440? So old...

 

 

Has anyone done a poll on what type of video card everyone has? That might give us some kind of consensus on what you should be shooting for. Personally, I only have a 5600 Ultra with 256MB of RAM, but I was eyeing that new 5950 card for 380 bucks. This time next year, that baby will be about 100 bucks or cheaper, and something better will be out (GeForce 20000 with 1GB of RAM?). I know we don't want the requirements to be super high, since we want to provide this to a wide base of players, but by that time I'd say the standard will be the 5600 range or higher. I really don't know much about programming beyond the little I learned in high school, so it's just a thought, but how hard would it be to design it so that you can choose your settings at the start, like in some games? It would auto-detect your video card and set the settings to the most playable preference, so that everyone gets a chance, even if you want to play it on a laptop or something.
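For what it's worth, the auto-detect idea is very doable. Here's a rough sketch (again assuming a Direct3D 9 renderer; the preset names are made up for illustration, not anything Xenocide actually defines) of picking a default quality level from what the card reports about itself:

[code]
#include <d3d9.h>

enum Preset { PRESET_LOW, PRESET_MEDIUM, PRESET_HIGH };  // made-up names

// Pick a default quality preset from the device's own capability report.
Preset AutoDetectPreset(IDirect3DDevice9* device, const D3DCAPS9& caps)
{
    // Estimate of available video/texture memory, in MB.
    UINT vramMB = device->GetAvailableTextureMem() / (1024 * 1024);

    // DX9-class shaders plus plenty of memory: all the eye candy on.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0) && vramMB >= 128)
        return PRESET_HIGH;

    // DX8-class (GeForce3 / GeForce4 Ti): most effects on.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1) && vramMB >= 64)
        return PRESET_MEDIUM;

    // DX7-class baseline (GeForce2, laptops): fixed-function path only.
    return PRESET_LOW;
}
[/code]

The player could still override whatever the auto-detect picks from an options screen.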


Newer video cards aren't _that_ much faster than older ones (despite what the manufacturers say). The main differences are memory and some other speedup features like hardware transform and lighting, layered textures, and on-card support for pixel shaders. All of this stuff can be made optional.
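As a concrete example of "optional": under Direct3D 9 (again, just an assumption about the renderer) you can check a caps bit and fall back to software vertex processing when the card has no hardware T&L. A rough sketch, assuming d3d, hwnd and presentParams already exist in the engine's setup code:

[code]
// Use hardware T&L when the card has it, otherwise fall back to
// software vertex processing (e.g. for pre-GeForce cards).
D3DCAPS9 caps;
d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

DWORD behavior = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
                     ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                     : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

IDirect3DDevice9* device = NULL;
d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                  behavior, &presentParams, &device);
[/code]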

 

We surprised ourselves on Thursday by persuading our game to run on a Celeron 600 with a basic GeForce3. All the options were on the floor, mind :D


You guys, honestly :) "Oh, it's only an FX" or "only a 9800". Come on, smell the coffee.

 

My second machine gives me the perfect chance to evaluate the performance of various cards.

 

In my main machine I'm running a Ti4200 (from reading what you guys are saying, it's like a stone-age card). In my second machine I run a GeForce256. Yep, that's right, the original 256; before that I ran a Voodoo3 in it.

 

Bear in mind that my main machine is a 3.06GHz P4 with a gig of dual-channel DDR and the second machine is an AMD 1200 with 512MB; there isn't a lot of difference between them. Sure, the 4200 runs quicker and at higher resolutions, but the other machine has handled everything I've thrown at it so far without a hitch, including No One Lives Forever 2, Halo and Warcraft 3.

 

This has led me to question why (other than lining the pockets of the gfx companies) we 'need' such high-spec, cutting-edge cards. More importantly, why do we need them as a baseline in Xenocide?

 

I think sometimes the marketing people have won our (collective) souls. I've seen it all over the hardware review forums and pretty much everywhere: "OMG, I need the uber spank-me card because Half-Life 2 won't run on my current one" and "Ha, you losers with only the mega spank-me cards are never gonna run it; our card gets 10 more fps than yours".

 

It's like a frenzy of little boys shouting "my dad's bigger than yours". There are only a handful of games that push the envelope as far as specs are concerned, and not really any that need the kind of power the review sites claim.

 

X-COM ran on 386s and 486s at 20MHz with 4MB of RAM. I don't think the specs for the game even needed to be that high, as it also ran on the Amiga, which was (and is) a Motorola 68000 at roughly 7MHz with 1MB of RAM.

 

I really think we should be aiming for the same kind of market penetration as the original enjoyed, which means making it available to as many people as possible.

 

The GF2 (with a quick enough proc) will handle any current game. Heck, even Halo runs perfectly well on 'only' a GF3.

 

Seeing as Xenocide isn't going to need the kind of resources that, say, UT2004 or Half-Life 2 would need, we really needn't worry about 'upping' the baseline specs.

 

You have to ask what's more important: making sure the game runs OK for the majority of people (600MHz and a GF2 will reach a heck of a lot more players), or keeping it only for the elite 'enthusiast' market who spend much more on their computers than is healthy (I include myself in this group ;)).

 

Personally I'd prefer to see the game available to the widest audience; after all, that's why we're making it, isn't it?

 

I'm sure RK has said it before, but we'll only 'up' the requirements when we need the game to do something our specs can't handle.

 

Rant over ;)


Yes, I understand all your views...

 

All I was trying to say is that with the better cards, maybe there can be an option for us to have better graphics, i.e. better lighting effects, pixel shading, all that other good stuff that comes with the newer cards. Just a thought; I didn't say it shouldn't be able to run on old machines, just that maybe there can be some benefits to playing it with a newer card. Again, just a thought...


:) Sorry about the rant. I get carried away sometimes.

 

I agree totally with you. As usual, I didn't put it across properly in my post. I agree we should have 'flashy effects' available as an option for those that want them, but I guess our primary goal is to get the game out and working before we add all the sfx to it :)


ONLY 80 trillion? That still can't emulate the entire world down to the atom! :P Actually, more detail comes from textures, bump mapping and lighting than from actual triangles nowadays. That's what newer cards are optimising for.

 

The FX 5200 (DX9) is actually weaker than the GF4 Ti (DX8) series; its main benefit is supporting the newest DirectX 9 effects. The GeForce2 series is DirectX 7, IIRC, which may not support the cooler effects. Of course, I guess that's what you mean by upping the requirements as needed.


  • 2 weeks later...
Great discussion. I made the mistake a few years back of getting a GeForce2 MX, and it burnt out on me within two months of having it. I wasn't happy at all. I think having a GeForce2-compliant engine isn't a bad idea. It sits right in the middle ground of reliability, given the typical capabilities of modern computers: it's not too much and it's not too little.

Great discussion. I made the mistake a few years back of getting a GeForce2 MX, and it burnt out on me within two months of having it. I wasn't happy at all.

Mistake? I still have that card, and it has been working fine for 2 years. It doesn't even have a cooler on it! My friend (a hardware pro who advises me on any hardware issues) practically toppled over when he heard the card has no cooler, and said to replace it immediately. That was 3 months ago and the card is still all right! :wink:


Ah man, I envy you dearly. Ever since that card burned out on me I haven't bothered to get a new one; I just slipped in an older one I had from an older machine. All I know is that it's an AGP card, but not nearly as good as the 2MX I had. Wow, it's bizarre yours is still working. I know nVidia doesn't even sell them anymore. Lmao. :LOL:

I bought a used one from eBay; it didn't come with a manual, and I didn't think about a cooler myself. :) It was around $40 then, but I bet you can get one much cheaper now.

I'm not replacing my old machine, because I need a second one. I just put a new MB/CPU/memory into it, and I hope that's the last upgrade it will ever get for the next few years.

All that new and fancy stuff is so expensive, but the real question is: is it really necessary? Do I really need a 3GHz CPU? Or FX9000 video? :)


:) RAM is cheap as Coke now. :)

2GB is overkill, unless you are launching 25 applications at once or running several virtual OSes...  -_-

Heh, yeah, right. Tell that to my system. Dual-channel DDR400 isn't cheap, and buffered ECC and Rambus aren't either :whatwhat:

 

2GB would be just about right for me. I feel my system could use the extra gig when rendering and opening up my larger art files.

 

But then again, I could easily justify a dual Xeon or dual Opteron workstation with a Quadro card after rendering my latest scenes :)

 

Taking the subject a little further OT, has anyone seen the new dual Opteron boards? Mmm boy, 6 RAM slots, which gives dual dual-channel (no, it's not a typo) memory. Feel the speed :happybanana:


:blink:

I like computers, but not to that extent! :)

I would rather buy a Z4 for that money. :)

If you can tell me where to get a Z4 for £2500, I'll snap your hand off and buy 4 ;)

That is, assuming you're talking about the BMW Z4 :blink:


But then again, I could easily justify a dual Xeon or dual Opteron workstation with a Quadro card after rendering my latest scenes

 

Taking the subject a little further OT, has anyone seen the new dual Opteron boards? Mmm boy, 6 RAM slots, which gives dual dual-channel (no, it's not a typo) memory. Feel the speed

Uh oh, did someone mention getting a new computer? :Blush:

 

Here's a very in-depth article on the AMD Opteron: http://www6.tomshardware.com/cpu/20030422/opteron-02.html

There's a video of it in action on the page, as well.

 

The whole article is loaded with handy benchmarks and info to show the Opteron performance versus the Xeon.

 

Long article short: the dual Opteron (1.8GHz) is neck and neck with the dual Xeon (3.0GHz). But the dual Opteron was much, much cheaper, last time I checked.

