Technology Stocks : 3DFX


To: John Finley who wrote (9446), 12/8/1998 11:28:00 PM
From: Jeff Lins
from VE:

Just checking out the latest buzz on Sierra's 3D RTS (set in space), Homeworld, over on
Homeworld.org and spotted the following nugget of news (posted by Keith Hentschel, one of
the graphics programmers):

As far as graphics cards go, Homeworld has been predominantly developed using
Voodoo Graphics and Voodoo2s w/ OpenGL, not only because of the quality of
3Dfx's boards but because of our choice of development OS - NT 4, which lacks a
recent version of DirectX. We support OpenGL & Direct3D (DirectX 6), and I
recommend using OpenGL if you've got a Voodoo and Direct3D otherwise (most
vendors are putting more optimization effort into their D3D drivers, but quality of
OpenGL support has improved by leaps and bounds in the past few months
especially). Myself, I've switched over to using the TNT w/ D3D Homeworld -
fantastic looking output, and framerate remains quite high even at 1600x1200 (you
really have to see it to believe it).

A note about Glide: we currently have Glide support for 3Dfx cards and may ship
with it, but I have qualms about dedicating my time to support a vendor's proprietary
API, regardless of their marketshare. I'll take another look at Glide after the final
round of OpenGL optimizations is done and if 3Dfx's hardware runs just as fast in
OpenGL as in Glide, Glide support will be axed (and not missed).
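The point Hentschel makes about not wanting to maintain a vendor-proprietary API alongside OpenGL and D3D is usually handled by hiding the backends behind a small dispatch table. A minimal sketch of the idea (all names here are hypothetical, not from Homeworld's actual code):

```c
#include <stdio.h>
#include <stddef.h>
#include <string.h>

/* Engine code calls only through this table, never a vendor API
   directly, so a backend like Glide can be dropped without
   touching any callers. */
typedef struct {
    const char *name;
    void (*draw_triangles)(int count);
} Renderer;

static void gl_draw(int count)  { printf("GL: drawing %d triangles\n", count); }
static void d3d_draw(int count) { printf("D3D: drawing %d triangles\n", count); }

static const Renderer renderers[] = {
    { "opengl", gl_draw },
    { "d3d",    d3d_draw },
};

/* Pick a backend by name; unknown names fall back to OpenGL. */
const Renderer *select_renderer(const char *name)
{
    for (size_t i = 0; i < sizeof renderers / sizeof renderers[0]; i++)
        if (strcmp(renderers[i].name, name) == 0)
            return &renderers[i];
    return &renderers[0];  /* default backend */
}
```

With this layout, "axing" Glide is just deleting one table entry, which is why the cost-benefit tilts against a proprietary API once the portable paths run as fast.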



To: John Finley who wrote (9446), 12/8/1998 11:34:00 PM
From: Jeff Lins
More from VE, from the designer of a game that has won two very prestigious awards: latest game ever to ship, and buggiest game ever. Ever. :

Gimpy at AMDZone has posted an interview with Take Two's favorite guy, Dr. Derek Smart,
the guy behind Battlecruiser. A little rogering for ya:

Q: In other interviews you've stated that BC3K 3020 would use 16-bit textures
and lighting. With Voodoo 3, the biggest complaints have been its lack of
32-bit color and its low-resolution texture size (256x256). Do you feel that
this is important in games, and will you be doing 32-bit color and bumping
up the texture size in BC3K 3020? Which video card do you personally like
most to work with?

In the general scheme of things, I don't think I'm going to miss much by not having
32-bit color. The industry is still wowing the gamers over 16-bit color rendering right
now anyway and I don't see 32-bit being that much of a big deal. Besides, in my
opinion, 1999-2000 will be the period during which the industry shifts focus back to
what counts: gameplay. The 1997-98 period has been the period of glitz with games
going from mediocre to just plain bad and hiding all that crap behind glitz. It sold for
awhile when a 3D card was a novelty. It's no longer a novelty, it's a must-have and as
such, will go the way of the sound card revolution. I think we'll do just fine with 16-bit
until Y2K. If anyone's going to harp on it, it'll be those who want to be the first on the
bandwagon. The fact of the matter is, let them, the rest of us will hop on at some
point if we deem it feasible and if the gamers demand it. By that time, I'm sure the
other cards lacking in this arena, Voodoo 4 (?), will follow suit.

We have no intentions of doing 32-bit color in the near future because I do not feel
that it is important, for the reasons stated above.
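For context on the 16-bit vs 32-bit debate in the interview: 16-bit color in this era typically means RGB565 (5 bits red, 6 green, 5 blue per pixel), halving framebuffer bandwidth versus 32-bit at the cost of visible banding in smooth gradients. A sketch of the packing, purely illustrative and not from any shipped engine:

```c
#include <stdint.h>

/* Pack 8-bit-per-channel RGB into a 16-bit RGB565 pixel
   (5 bits red, 6 bits green, 5 bits blue). */
uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* Expand back to 24-bit, replicating each channel's high bits into
   its low bits so full white stays full white. Up to 3 low bits per
   channel are lost in the round trip, which is where 16-bit color
   banding comes from. */
void unpack_rgb565(uint16_t p, uint8_t *r, uint8_t *g, uint8_t *b)
{
    uint8_t r5 = (p >> 11) & 0x1F, g6 = (p >> 5) & 0x3F, b5 = p & 0x1F;
    *r = (uint8_t)((r5 << 3) | (r5 >> 2));
    *g = (uint8_t)((g6 << 2) | (g6 >> 4));
    *b = (uint8_t)((b5 << 3) | (b5 >> 2));
}
```

The lost low bits are why single-pass 16-bit scenes look fine while multi-pass blending accumulates banding, and why the question of 32-bit support kept coming up for the Voodoo line.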