Message edited by: Don Reba 03/22/2007 16:54:39
---QUOTATION--- How is it that Farcry, a game that came out in March of 04 (3 years ago) and had large open areas and lots of vegetation, as well as complex water and particle effects, can run just fine without requiring a $350-400 graphics card and still look much better than Stalker? ---END QUOTATION--- In short: more scalable performance at the expense of higher minimum requirements.
---QUOTATION--- So explain to me, how is it that every other game that has HDR (Farcry, HL2 for example) has an option to turn it off, yet Stalker does not? ---END QUOTATION--- Seeing how keen you are on technical details, it seems best to refer you to the book GPU Gems 2, which describes STALKER's approach to rendering.
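For anyone who does not have the book handy: STALKER's renderer is a deferred one, which is the technical reason HDR is not a simple on/off checkbox there the way it is in forward renderers like Far Cry's. A rough sketch of the two-pass idea (hypothetical illustration only, not X-Ray engine code; all names are made up):

```python
# Deferred shading in miniature: pass 1 rasterizes geometry once and stores
# surface attributes in a G-buffer; pass 2 computes lighting from the G-buffer
# alone. Lighting (and the HDR accumulation it feeds) is baked into the
# pipeline's structure rather than being an optional post-effect.
from dataclasses import dataclass

@dataclass
class GBufferTexel:
    albedo: tuple   # (r, g, b) surface color
    normal: tuple   # (x, y, z) unit surface normal
    depth: float    # distance from camera

def geometry_pass(fragments):
    """Draw the scene once, storing attributes instead of shading."""
    return [GBufferTexel(f["albedo"], f["normal"], f["depth"]) for f in fragments]

def lighting_pass(gbuffer, light_dir, light_intensity):
    """Shade every texel using only the G-buffer: geometry is never redrawn,
    so each extra light costs per-pixel work, not per-triangle work."""
    shaded = []
    for t in gbuffer:
        n_dot_l = max(0.0, sum(n * l for n, l in zip(t.normal, light_dir)))
        shaded.append(tuple(c * n_dot_l * light_intensity for c in t.albedo))
    return shaded

# Two example "pixels": one facing the light, one perpendicular to it.
fragments = [
    {"albedo": (1.0, 0.0, 0.0), "normal": (0.0, 0.0, 1.0), "depth": 5.0},
    {"albedo": (0.5, 0.5, 0.5), "normal": (0.0, 1.0, 0.0), "depth": 9.0},
]
gbuffer = geometry_pass(fragments)
shaded = lighting_pass(gbuffer, light_dir=(0.0, 0.0, 1.0), light_intensity=1.0)
print(shaded)  # first texel fully lit, second receives nothing
```

This also illustrates the scalability trade-off above: the G-buffer makes many dynamic lights cheap, but the extra render targets set a higher hardware floor.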
---QUOTATION--- HL2 and Far Cry don't have HDR, deferred rendering, parallax mapping, or dynamic shadow mapping. They are another generation of game and engine. Those are Unreal Engine 3 and CryEngine 2 features. The Unreal Engine 3 demo in 2004 ran on two 6800 Ultras. The X-Ray engine's DX9 renderer is a 2004 engine, like Unreal Engine 3.
And the recommended specs, from the game readme:
nVIDIA® GeForce™ 7900/ATI Radeon® X1950 with 256 MB
Intel Core 2 Duo E6400 / AMD 64 X2 4200
1.5 GB RAM
Who cares if it has deferred light rendering crap and parallax mapping??
As long as the game looks good and plays well.. and wtf are you thinking, how the hell is X-Ray a next-gen engine?? The textures are crap, and so are the models and animation.. ---END QUOTATION---