My Global Illumination tests

  15:20:28  5 February 2012
ket
Senior Resident
On forum: 01/13/2006
Message edited by: ket, 02/05/2012 16:12:05
Messages: 1432
My Global Illumination tests

As graphically maxing out SoC is the theme of the moment, I thought I would apply my graphical knowledge and my understanding of GI and how X-Ray uses it, and run some experiments. Here are the results:

32 Photons:
http://img.photobucket.com/albums/v187/bizket/GI32.jpg

64 Photons:
http://img.photobucket.com/albums/v187/bizket/GI64.jpg

64 Photons (performance mode):
http://img.photobucket.com/albums/v187/bizket/GI64-93.jpg

128 Photons:
http://img.photobucket.com/albums/v187/bizket/GI128-90.jpg

128 Photons (performance mode):
http://img.photobucket.com/albums/v187/bizket/GI128-85.jpg

For those who aren't sure where to look, watch the desk lamps in the foreground and background and how they cast their light. Effectively, the higher the photon count, the more accurate you can make GI; in some cases you can get more accurate GI at better FPS than lower-quality, less accurate GI. I definitely don't recommend messing with GI parameters unless you know exactly how they work, because the changes required are very fine (I'm talking 0.0000(1) adjustments here) and most people would simply lose patience.

Do note, Global Illumination is a VERY demanding feature in SoC; it can, and will, bring your GPU and CPU to their knees if you don't pick realistic GI levels that modern GPUs / CPUs can handle.
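
For readers who want to experiment along, GI in SoC is driven by a small set of r2_gi_* console variables that can be set from the in-game console or saved in user.ltx. A minimal sketch of the knobs involved (the names are the engine's; the values are illustrative examples, not the tuned settings from this thread):

r2_gi on           ; master GI toggle (off by default)
r2_gi_photons 32   ; photons traced per light source - the count varied in these tests
r2_gi_depth 1      ; bounce depth for indirect light
r2_gi_refl 0.9     ; reflectivity scale applied to bounced light
r2_gi_clip 0.001   ; clip threshold - one candidate for the very fine 0.0000(1)-scale adjustments mentioned above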
  18:33:46  5 February 2012
Meltac
messing with code
(Resident)
On forum: 01/21/2010
Message edited by: Meltac, 02/05/2012 18:34:43
Messages: 1519
Interesting, for sure. I especially appreciate the "maxing out" thinking that is currently making the rounds. However:


---QUOTATION---
Do note, Global Illumination is a VERY demanding feature in SoC; it can, and will, bring your GPU and CPU to their knees if you don't pick realistic GI levels that modern GPUs / CPUs can handle.
---END QUOTATION---



That's the main reason why many people (including me) have disabled GI altogether and won't benefit from your findings.
  20:30:58  5 February 2012
ket
Senior Resident
On forum: 01/13/2006
Message edited by: ket, 02/05/2012 20:35:15
Messages: 1432
You'll notice the FPS counter in those screenshots; that's with a 1GB GTX460 @ 900 / 4200, meaning those settings are very viable for anyone with an OC'd GTX460 or better. In fact, an OC'd GTX470 or better could easily handle any of the fine-tuned GI settings I'm using in those screenshots comfortably, perhaps with an exception for the 128 Photons setting, which is meant very much for "next gen" GPUs / CPUs.

It's not easy optimising GI in SoC; there's far more to it than simply enabling GI and setting the photon count, but the results are very much worth it.
  21:19:43  5 February 2012
Seeker_of_Strelok
Senior Resident
On forum: 05/02/2011
Messages: 279
G.I. Tests

Any plans on using these in your new beta / finished mod?
  22:02:47  5 February 2012
ket
Senior Resident
On forum: 01/13/2006
Message edited by: ket, 02/05/2012 22:35:32
Messages: 1432

---QUOTATION---
G.I. Tests

Any plans on using these in your new beta / finished mod?
---END QUOTATION---



Yep, I have plans to include various files for GI with the next TK beta. There will be files ranging from 16 to 128 photons for users to choose from, with an accompanying screenshot of each GI implementation to help them decide which will run best on their system. I've just finished testing 64 Photons with my OC'd system, and while it can handle it, there are a few times when FPS dips below what people would consider desirable (about 17FPS), so I'd say an OC'd system with a GTX560 448-core should be just enough to keep the framerate acceptable. Alternatively, using the 64 Photons Performance profile could be a decent compromise.
  23:36:51  5 February 2012
EngineOfDarkness
(Senior)
On forum: 10/24/2008
Messages: 67
This thread actually made me search for my login credentials (I thought someone had posted a nearly flawless GI implementation).

Did you check different places in different levels?

From playing around with GI in my Stalker playthroughs, I have always found areas where it outright "sucked", so to speak, when I turned it on.

I then proceeded to play around with the settings until it looked good - only to realise it looked bad in other places.

E.g. "flickering" light, light bleeding through walls, light sources vanishing when you turn a certain number of degrees away from them - as if they are culled from the view (e.g. the artefact (the highest tier of Moonlight) in the Cordon tunnel (under the dam) - not sure if it's only in mods though; I haven't played vanilla for a long time).

These problems were always show-stoppers for me, as I notice them instantly and they annoy me.

Same with the shadows. While the "weird shadow edge at certain angles" thing is rather easy to fix, the other shadow bugs are not (like the shadow "stripes" if you zoom in on some far-away building, or shadows popping up / fading at certain angles).
  00:27:17  6 February 2012
dezodor
level designer
(Resident)
On forum: 04/08/2007
Messages: 3803

---QUOTATION---
This thread actually made me search for my login credentials (I thought someone had posted a nearly flawless GI implementation).

---END QUOTATION---



GI in the X-Ray engine is halfway done, and not optimized at all; that's why they kept it turned off by default...
  01:14:37  6 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432

---QUOTATION---
This thread actually made me search for my login credentials (I thought someone had posted a nearly flawless GI implementation).

Did you check different places in different levels?

From playing around with GI in my Stalker playthroughs, I have always found areas where it outright "sucked", so to speak, when I turned it on.

I then proceeded to play around with the settings until it looked good - only to realise it looked bad in other places.

E.g. "flickering" light, light bleeding through walls, light sources vanishing when you turn a certain number of degrees away from them - as if they are culled from the view (e.g. the artefact (the highest tier of Moonlight) in the Cordon tunnel (under the dam) - not sure if it's only in mods though; I haven't played vanilla for a long time).

These problems were always show-stoppers for me, as I notice them instantly and they annoy me.

Same with the shadows. While the "weird shadow edge at certain angles" thing is rather easy to fix, the other shadow bugs are not (like the shadow "stripes" if you zoom in on some far-away building, or shadows popping up / fading at certain angles).
---END QUOTATION---



Once I get to the lab levels again (I lost my saves when a bad USB driver nuked my Win7 install) I'll make a video of GI. I've never experienced bugs with GI; the flickering you describe sounds like you had r2_allow_r1_lights enabled.
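
For anyone wanting to rule that out on their own setup, that variable can be checked or toggled from the in-game console (or persisted in user.ltx); illustrative usage, with the off state being what ket describes as safe:

r2_allow_r1_lights off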
  01:18:01  6 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432

---QUOTATION---
This thread actually made me search for my login credentials (I thought someone had posted a nearly flawless GI implementation).


GI in the X-Ray engine is halfway done, and not optimized at all; that's why they kept it turned off by default...
---END QUOTATION---



Irrelevant. That was then, this is now. Hardware is now powerful enough to utilise it, as long as the wrench time is put into fine-tuning the settings for the best visuals-to-performance ratio.
  01:39:50  6 February 2012
EngineOfDarkness
(Senior)
On forum: 10/24/2008
Messages: 67

---QUOTATION---

Once I get to the lab levels again (I lost my saves when a bad USB driver nuked my Win7 install) I'll make a video of GI. I've never experienced bugs with GI; the flickering you describe sounds like you had r2_allow_r1_lights enabled.
---END QUOTATION---



Not that I remember; I always kept that setting off, as it felt way too bright (and it was eating additional FPS).

Mind sharing your GI setup? Maybe it is something that happens because of the mod / shaders I have installed (SoC version 1.0004).

Gonna test later today when I am back from work.
  15:14:45  6 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432
I might do; my focus right now is largely on finishing the TK mod rather than picking bits of it apart.
  21:29:46  6 February 2012
ket
Senior Resident
On forum: 01/13/2006
Message edited by: ket, 02/06/2012 21:38:18
Messages: 1432
Alright, here we go. After many hours of tweaking, testing, and fine-tuning, here is what I would consider the final result for an optimal GI implementation. Do note that in these screenshots I have exaggerated GI slightly in order to highlight the levels of accuracy between 70 and 128 photons (look at the lamp in the background). As such, you should ignore the FPS counter in the top left, as default settings are not as demanding, hence FPS is higher. Eagle-eyed people may notice the FPS counter actually climb as the photon count increases. You should not take this as a sign to immediately use the 128 Photons file and ignore the others. The reason for the higher FPS as more photons are used is that, through much painstaking optimising, I've managed to get GI to perform better with more photons in simple scenes; under complex scene conditions, higher photon counts will result in lower FPS.


---QUOTATION---
Minimum Recommended System Config (70 Photons):

Highly clocked Dual or Quad core CPU
Highly OC'd 1GB GTX460

Recommended System Config: (96-128 Photons)

Highly clocked Quad core CPU
Highly OC'd GTX580
---END QUOTATION---



I'm not kidding with those specifications to run this GI implementation properly, so don't whine if you have a crap PC and it runs like crap.

I tested all of these files on the following system:

i5 2500k @ 4.5GHz
GTX460 @ 900 / 4200
Absolutely no junk processes running (download RegCleaner and click the "Startup" tab to check your processes; delete everything that's worthless)

I've found the minimum FPS to be as follows:

70 Photons: 16-17FPS (avg. 60FPS+)
96 Photons: 12-13FPS (avg. 60FPS+)
128 Photons: 8FPS (avg. 50FPS+)

With all that said and done, the screenshots:

70 Photons:
http://img.photobucket.com/albums/v187/bizket/GI70.jpg

96 Photons:
http://img.photobucket.com/albums/v187/bizket/GI96.jpg

128 Photons:
http://img.photobucket.com/albums/v187/bizket/GI128.jpg

Lastly, the wonderful thing about the settings I'm using for GI is that as you upgrade your hardware, and as hardware further improves, there's no need to adjust lots of settings; adjusting one setting in the file is now enough to improve GI further.
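
Assuming the single setting in question is the photon count (the post doesn't name it), such an upgrade would be a one-line change in the supplied file, e.g.:

r2_gi_photons 70
; after a GPU/CPU upgrade, raise the one knob:
r2_gi_photons 96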
  02:42:15  7 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432
Here's a video: http://www.youtube.com/watch?v=XAPu1Wz-hno You'll have to ignore the slight video lag; recording videos on my system with Fraps always impacts the game. Actual performance is far better.
  22:36:27  10 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432
Second GI test video: http://www.youtube.com/watch?v=p_t1iUh011Y
  01:11:33  11 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432
Forgot to mention: the new video shows an all-new GI configuration I've been playing with. It's far less demanding on systems, so it should be suitable for pretty much all modern systems.
  01:54:44  11 February 2012
EngineOfDarkness
(Senior)
On forum: 10/24/2008
Messages: 67

---QUOTATION---
Second GI test video: http://www.youtube.com/watch?v=p_t1iUh011Y
---END QUOTATION---



Sure looks nice, but right at the beginning you can see the problem which always put me off using it - the white lights seem to penetrate geometry and "flash" far too much.

The orange ones don't seem to do that as much (though you can notice it a little at 0:17 - below the catwalk there's orange light bleeding through onto the wall). I wonder what the difference is. I've never modded Stalker, but maybe this is something in the light config of the levels (or the config for these light sources?) - something like lighting or another weird config setting?

There are a lot of other little nuisances which, in the end, put me off using it when I played around with the GI settings.
  03:08:23  11 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Messages: 1009
I wouldn't mind doing a test for you. I am on my new graphics card (Gigabyte 6870 Super OC), and my mod with enhanced shaders, textures, and Loner's PPX particles runs at 90 FPS.

If you want to talk to me on Moddb, search for Derranged.

Anyway, nice mod. It looks pretty good.
  03:53:11  11 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432

---QUOTATION---
Second GI test video: http://www.youtube.com/watch?v=p_t1iUh011Y

Sure looks nice, but right at the beginning you can see the problem which always put me off using it - the white lights seem to penetrate geometry and "flash" far too much.

The orange ones don't seem to do that as much (though you can notice it a little at 0:17 - below the catwalk there's orange light bleeding through onto the wall). I wonder what the difference is. I've never modded Stalker, but maybe this is something in the light config of the levels (or the config for these light sources?) - something like lighting or another weird config setting?

There are a lot of other little nuisances which, in the end, put me off using it when I played around with the GI settings.
---END QUOTATION---



That stuff is simply due to the number of photons I'm using. Fewer photons = less accurate GI. I'm having to fine-tune as I go, adding photons etc. to get good performance and accuracy.
  03:58:05  11 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432

---QUOTATION---
I wouldn't mind doing a test for you. I am on my new graphics card (Gigabyte 6870 Super OC), and my mod with enhanced shaders, textures, and Loner's PPX particles runs at 90 FPS.

If you want to talk to me on Moddb, search for Derranged.

Anyway, nice mod. It looks pretty good.
---END QUOTATION---



I'll let you know if I need a tester. Enabling GI will cleave your FPS in half, I can guarantee that. It's just a demanding feature.
  04:13:35  11 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Message edited by: Derranged, 02/11/2012 4:14:12
Messages: 1009

---QUOTATION---
I wouldn't mind doing a test for you. I am on my new graphics card (Gigabyte 6870 Super OC), and my mod with enhanced shaders, textures, and Loner's PPX particles runs at 90 FPS.

If you want to talk to me on Moddb, search for Derranged.

Anyway, nice mod. It looks pretty good.

I'll let you know if I need a tester. Enabling GI will cleave your FPS in half, I can guarantee that. It's just a demanding feature.
---END QUOTATION---



We will see about that.

I want to see what this graphics card can handle. It runs BF3 on max settings at 60 FPS average. Well, it just sits at 60 most of the time and only goes down when a huge explosion goes off in your face...
  13:52:39  11 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432
Performance-wise, your card sits around the level of a GTX560 Ti, same as my GTX460. I hope the next gen of cards brings something worth upgrading to, as even the GTX560 Ti 448 can only just barely scrape ahead of my GTX460.
  15:24:03  11 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Message edited by: Derranged, 02/11/2012 15:27:02
Messages: 1009
Wow... Also, newer cards do not support pixel shading, which means any tweaking with shaders will make the game crash and ask you to get Pixel Shader v1.1 or higher. Now, that comes with your graphics card. You can get an emulator, but most of the links are viruses.

Now, this might be another reason for me to test this out, to see if it will do this. If it does, then a lot of people won't like it when they cannot play o.o

Well, for starters, does this use, like... Meltac's shader base? If it does and you have turned on some of those options, then that will cause a newer card to crash.

So, overall, newer cards use multi-shading, which older games with heavy shaders do not support.

EDIT: Hmm, it seems Loner1 made a weird discovery... His older card had the problem I am having, but then he got two of that one card, ran them in CrossFire, and it worked o.o
  18:26:00  11 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432
What issue are you having? Using Kingo's shaders with some of my own shader tweaks, I don't have any problems on any card I've tested my mod with (X1950Pro, HD3870, HD4830, HD5830, GTX460). The only difference between a GTX460 and a GTX560 (or any other GTX5xx, for that matter) is that the GPU architecture has been tweaked, so you should, in theory, be able to get better OCs. Apart from that, the architecture is identical, with the exception of a few parts of the GPU being enabled/disabled based on what "level" of card you have.
  22:17:08  11 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432
Finally, I've done it. Using as few photons as possible, I've got a Global Illumination effect that's halfway decent and isn't massively demanding on a system, though you will still need a good CPU and GPU. An OC'd i5 2500k and an OC'd GTX460 should be enough to run this GI config without lag. The next GI step up from this, I predict, will only be viable in another 2 years (4 years for people who run only mid-level hardware). I'm pushed for time atm, so no video, just a few screens.

GI On:
http://img.photobucket.com/albums/v187/bizket/GIOn.jpg

GI Off:
http://img.photobucket.com/albums/v187/bizket/GIOff.jpg
  10:59:28  12 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Messages: 1009
Yeah, Meltac's shaders, which are based on Kingo's... But his are still basically the same, with some tweaks. I asked him, and he thinks there is a bug or something...
  14:38:38  12 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432
If there is a bug, it's a very obscure one; it hasn't shown itself on any of the cards I've tested with. I would put any issue you are having down to a driver-related problem.
  14:52:54  12 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Messages: 1009
Well, I downloaded the latest drivers as soon as I got the card, so it's not a driver problem (unless they made an epic fail).
  17:32:10  12 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432
It's AMD; the drivers have sucked ever since AMD bought ATi. Drivers are very hit-and-miss now. I'd start by stepping back one driver version at a time and testing. Make sure you clean the drivers out properly each time too, with Driver Sweeper.
  17:44:37  12 February 2012
EngineOfDarkness
(Senior)
On forum: 10/24/2008
Messages: 67

---QUOTATION---
Finally, I've done it. Using as few photons as possible, I've got a Global Illumination effect that's halfway decent and isn't massively demanding on a system, though you will still need a good CPU and GPU. An OC'd i5 2500k and an OC'd GTX460 should be enough to run this GI config without lag. The next GI step up from this, I predict, will only be viable in another 2 years (4 years for people who run only mid-level hardware). I'm pushed for time atm, so no video, just a few screens.

GI On:
http://img.photobucket.com/albums/v187/bizket/GIOn.jpg

GI Off:
http://img.photobucket.com/albums/v187/bizket/GIOff.jpg
---END QUOTATION---



Judging from the pics it looks good; also, it seems that orange light bleeds through the catwalk with GI off anyway.

Well, I could nitpick about something else, but I guess X-Ray is an engine where compromises have to be made no matter what.

Gotta try the config out once your mod is released.
  18:55:50  12 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Messages: 1009
I could revert, but then some newer games might not work. I am thinking about sending them a technical support message about this issue. Maybe then they will release a new driver which fixes it.
  19:11:48  12 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432
That's highly unlikely. New games will work with old drivers; it's just that sometimes an old driver might not perform as well as a newer one. I doubt AMD will do anything even if you report the issue; they just suck at writing good drivers. It's why I moved away from AMD/ATi: I got sick of the rubbish drivers causing BSODs for no reason.
  02:18:03  13 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Messages: 1009
I have never had a BSOD from a graphics driver, and I have been with AMD/ATi for a long time.
  02:29:37  13 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432
I've had plenty - atim-something. The drivers just aren't as stable as Nvidia drivers tend to be, but I fully expect that to change in the future, as it has done in the past. I vote with my wallet and with what will give me the best stability. Right now, on both counts, that's Nvidia, but with the new gen of cards that could change.
  02:41:21  13 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Messages: 1009
Well, yeah. I wonder how many problems the new 7950s and 7970s have (count in the 7990, even though it isn't out yet).

I would guess the 7990 will cost around $1000, because it has 6GB of onboard RAM and is just insane.
  14:19:24  13 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432
BGA memory is something manufacturers really gouge the consumer on. On average, a manufacturer charges about £66 extra per additional 1GB on a graphics card. In contrast, a 1GB DDR3 memory stick (which also uses BGA memory) only costs £20-30. That's how much the consumer is screwed with graphics cards that have more than 1GB of vRAM. It's pretty outrageous, even kind of ironic. The cheapest GTX560 Ti 448 is about £200 without shipping, yet you can buy a GTX480 for around £160 with shipping. That's a no-brainer; the latter is still more powerful and cheaper.
  14:45:54  13 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Messages: 1009
Yeah, my 6870 Super OC gets a better framerate than my old 5770, but it wasn't what I was expecting. I was actually expecting more from it.
  15:48:00  13 February 2012
ket
Senior Resident
On forum: 01/13/2006
Message edited by: ket, 02/13/2012 15:51:59
Messages: 1432
The only way to get more out of your card is to OC it, something that's always worth doing with higher mid-level hardware, as the OC usually bumps that kind of hardware to levels comparable with relatively top-end hardware. For example: my GTX460 had stock clocks of 725/3600 - very ordinary and nothing overly outstanding. However, once I OC'd the card it reached fully stable clocks of 900/4200. Those clocks are enough to rival the performance of cards such as a GTX470, GTX560 Ti, GTX560 Ti 448, etc.
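
(Rough arithmetic for scale, not from the post itself: 725 to 900MHz core is (900 - 725) / 725 ≈ 24% above stock, and 3600 to 4200 effective memory is (4200 - 3600) / 3600 ≈ 17%.)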
  22:41:16  13 February 2012
Meltac
messing with code
(Resident)
On forum: 01/21/2010
Message edited by: Meltac, 02/13/2012 22:42:05
Messages: 1519

---QUOTATION---
Yeah, Meltac's shaders, which are based on Kingo's... But his are still basically the same, with some tweaks. I asked him, and he thinks there is a bug or something...
---END QUOTATION---



Actually, I meant that the error message you're getting normally indicates a programming error in the shader files (a syntax and/or semantics error - you might call it a bug if you really like).

I have never had that message from issues whose cause was not that type of error; however, I'm not entirely sure whether it couldn't have some other cause.

Just to mention it, did you also check your DirectX version? Sometimes a game engine requires a very specific version, and even newer ones seem not to be entirely backwards-compatible. So it might at least be worth a try to check out some other DX9 version.

In any case, I do not see any reason why the vanilla shaders should work and some other shader mod should not - apart from programming errors. But since GPU technology is a complex field, I might be wrong.
  01:24:17  14 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432
I'm well versed with GPUs, CPUs... pretty much any hardware, really, and a good deal of software. A modern GPU will work with the oldest of games; you just might experience weird bugs in games older than DX9, because drivers are no longer optimised for those games. As for DX, I use the June 2010 redist (still the newest) and I don't get a problem in any games or other software.
  03:20:32  14 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Message edited by: Derranged, 02/14/2012 10:32:42
Messages: 1009
Since you told me the date of the DirectX redist, I am downloading it right now to see if that helps.

EDIT: It didn't help... Damn...
  11:17:59  14 February 2012
Meltac
messing with code
(Resident)
On forum: 01/21/2010
Messages: 1519

---QUOTATION---
I'm well versed with GPUs, CPUs... pretty much any hardware, really, and a good deal of software. A modern GPU will work with the oldest of games; you just might experience weird bugs in games older than DX9, because drivers are no longer optimised for those games. As for DX, I use the June 2010 redist (still the newest) and I don't get a problem in any games or other software.
---END QUOTATION---



If so, do you by chance have any suspicion of what the problem might be in Derranged's case? If it's not a syntax error, nor a hardware or driver issue, what else might be causing a new GPU not to work with my shaders?
  11:44:16  14 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Messages: 1009
It could possibly be my graphics card itself, dunno. I just really need to get this to start working.
  12:10:07  14 February 2012
Meltac
messing with code
(Resident)
On forum: 01/21/2010
Messages: 1519

---QUOTATION---
It could possibly be my graphics card itself, dunno. I just really need to get this to start working.
---END QUOTATION---



What other shader packs did you test? Only vanilla, or maybe Sky4ce or Kingo's? And which options did you enable to produce that crash?
  14:00:45  14 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432

---QUOTATION---
I'm well versed with GPUs, CPUs... pretty much any hardware, really, and a good deal of software. A modern GPU will work with the oldest of games; you just might experience weird bugs in games older than DX9, because drivers are no longer optimised for those games. As for DX, I use the June 2010 redist (still the newest) and I don't get a problem in any games or other software.

If so, do you by chance have any suspicion of what the problem might be in Derranged's case? If it's not a syntax error, nor a hardware or driver issue, what else might be causing a new GPU not to work with my shaders?
---END QUOTATION---



It could be a good old-fashioned case of the GPU overheating. New shaders with SoC really work a GPU; even with my system, my GPU heats up to around 76C while playing SoC with the new shaders. If it's not that, I would look into CPU and RAM stability (Intel Burn Test for the CPU, HCI Memtest for the RAM), as they would be the other suspects to look at. Depending on how old the system is, the mainboard chipset & VRMs (in the latter's case, if they are heatsinked) could need a new application of TIM; the same applies to the CPU.

If none of that fixes the problem, I would start looking at software errors. The best way to do that is to download and use PCDoc Pro and TuneUp Utilities. Once those applications have done their thing, run TuneUp Utilities' registry optimiser so it has a proper look at the registry, not just the quick look it does if you run 1-click maintenance. Once all that is done, download and use Diskeeper to defrag the HDD, then reinstall the DX June 2010 redist. Fire up Stalker and see what happens. I wouldn't rule out a corrupt Stalker install at this point either.
  14:32:09  14 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Message edited by: Derranged, 02/14/2012 14:37:13
Messages: 1009

---QUOTATION---
It could possibly be my graphics card itself, dunno. I just really need to get this to start working.

What other shader packs did you test? Only vanilla, or maybe Sky4ce or Kingo's? And which options did you enable to produce that crash?
---END QUOTATION---



I haven't tested any others, and all options made it crash. The shaders work, though; it's just the options that make it fail.


---QUOTATION---
It could be a good old-fashioned case of the GPU overheating. New shaders with SoC really work a GPU; even with my system, my GPU heats up to around 76C while playing SoC with the new shaders. If it's not that, I would look into CPU and RAM stability (Intel Burn Test for the CPU, HCI Memtest for the RAM), as they would be the other suspects to look at. Depending on how old the system is, the mainboard chipset & VRMs (in the latter's case, if they are heatsinked) could need a new application of TIM; the same applies to the CPU.

If none of that fixes the problem, I would start looking at software errors. The best way to do that is to download and use PCDoc Pro and TuneUp Utilities. Once those applications have done their thing, run TuneUp Utilities' registry optimiser so it has a proper look at the registry, not just the quick look it does if you run 1-click maintenance. Once all that is done, download and use Diskeeper to defrag the HDD, then reinstall the DX June 2010 redist. Fire up Stalker and see what happens. I wouldn't rule out a corrupt Stalker install at this point either.
---END QUOTATION---



Nah, my GPU isn't overheating; it crashes on startup, when it is not even close to 40 degrees Celsius. The case is packed with fans and I have set them to go faster when anything gets hot.

It won't be the CPU or the RAM... The RAM is brand new, and my previous RAM did the same thing. I have had this CPU since my old 5770, and I used Meltac's shaders with options turned on before.

My PC is brand new, so it is not old at all.

It's very weird.

I will have a look into those utilities.
  15:12:21  14 February 2012
Meltac
messing with code
(Resident)
On forum: 01/21/2010
Message edited by: Meltac, 02/14/2012 15:13:54
Messages: 1519

---QUOTATION---
I haven't tested any others, and all options made it crash. The shaders work, though; it's just the options that make it fail.
---END QUOTATION---



Sorry, I still don't get it.

By "options" I assume you mean the settings you can specify in the ShaderSettings.txt file. If so, what do you mean by "all options made it crash"? That whole file consists only of options, and most of them are enabled by default. So did you first disable all of them and the shaders worked, and then it crashed after you re-enabled one of them? Or what exactly did you do to ensure the shaders themselves work?

@ket: I understand there might be several hardware/driver/firmware or whatever reasons for the game to crash with some GPU-intensive shader pack (like mine).

However, what I do not understand: the error message he's getting is thrown while the shaders' source code is compiled at game startup. So, as I said, it normally indicates that something in the shaders' source code is not valid.

Now, at that point none of the shaders has actually been executed yet, so why should a GPU-/CPU-/RAM-related issue arise in that situation? If it were one, shouldn't the game crash at the earliest in the main menu or, even more likely, after a savegame has been loaded, i.e. when the engine actually starts shader processing?
  16:10:37  14 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Messages: 1009

---QUOTATION---
I haven't tested any others, and all options made it crash. The shaders work, though; it's just the options that make it fail.

Sorry, I still don't get it.

By "options" I assume you mean the settings you can specify in the ShaderSettings.txt file. If so, what do you mean by "all options made it crash"? That whole file consists only of options, and most of them are enabled by default. So did you first disable all of them and the shaders worked, and then it crashed after you re-enabled one of them? Or what exactly did you do to ensure the shaders themselves work?

@ket: I understand there might be several hardware/driver/firmware or whatever reasons for the game to crash with some GPU-intensive shader pack (like mine).

However, what I do not understand: the error message he's getting is thrown while the shaders' source code is compiled at game startup. So, as I said, it normally indicates that something in the shaders' source code is not valid.

Now, at that point none of the shaders has actually been executed yet, so why should a GPU-/CPU-/RAM-related issue arise in that situation? If it were one, shouldn't the game crash at the earliest in the main menu or, even more likely, after a savegame has been loaded, i.e. when the engine actually starts shader processing?
---END QUOTATION---



Yes, that is what I mean. You have some of them on by default, but when I go to turn on more, it just fails...
  16:19:50  14 February 2012
ket
Senior Resident
On forum: 01/13/2006
Message edited by: ket, 02/14/2012 16:24:21
Messages: 1432

---QUOTATION---
Nah, my GPU isn't overheating; it crashes on startup, when it is not even close to 40 degrees Celsius. The case is packed with fans and I have set them to go faster when anything gets hot.

It won't be the CPU or the RAM... The RAM is brand new, and my previous RAM did the same thing. I have had this CPU since my old 5770, and I used Meltac's shaders with options turned on before.

My PC is brand new, so it is not old at all.

It's very weird.

I will have a look into those utilities.
---END QUOTATION---



Look at your PSU and let me know how many amps (A) the 12V rail has. If your 12V rail is weak, that will be the source of your instability.
  16:56:31  14 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Message edited by: Derranged, 02/14/2012 16:57:39
Messages: 1009
I am using a top-brand PSU (Corsair HX520W modular PSU) and I know it isn't the problem.

http://forums.overclockers.com.au/showthread.php?t=625374

On there it says it puts out 18A on each of the 12V1, 12V2, and 12V3 rails...

EDIT: But I am running on 2 PSUs...

My main Corsair and my graphics card one...

My graphics card PSU is a Thermaltake one and I don't think it would be the problem.
  17:10:36  14 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432
Running on 2 PSUs generally isn't the brightest of ideas. 18A on each rail isn't overly high either; even mid-range graphics cards tend to require a rail with a good 24A or more. I'm certain your issue is system-specific and nothing to do with the game or shaders. Knowing every detail about your system would be useful.
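
(Back-of-the-envelope check, illustrative arithmetic only: power = volts x amps, so a single 12V rail at 18A supplies roughly 12 x 18 = 216W, while a 24A rail supplies 12 x 24 = 288W.)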
  17:14:32  14 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Message edited by: Derranged, 02/14/2012 17:28:43
Messages: 1009
How isn't it the brightest idea? The Tt PSU is specially designed to run only the graphics card...

http://www.thermaltakeusa.com/Product.aspx?S=1207&ID=1500

Look at its specs.

EDIT: Theirs is different from mine for some reason... The one I have isn't a main PSU. It comes with a cable which plugs into your main PSU, and then you plug your graphics card cables into it. The stats from their website should be the same.
  17:24:42  14 February 2012
Meltac
messing with code
(Resident)
On forum: 01/21/2010
Message edited by: Meltac, 02/14/2012 17:27:12
Messages: 1519

---QUOTATION---
Yes, you have some of them on by default, but when I go to turn on more, it just fails...
---END QUOTATION---



That's what makes me very curious and far from convinced of a hardware issue.

I'd recommend you check out Kingo's Max Shaders; he provides several presets with different options enabled/disabled. Try whether all of the presets work, or which of them fail. Furthermore, his shaders ship with a configuration GUI tool to edit all the options. It might be worth checking whether it makes a difference if you turn the options on with the tool rather than manually in your text editor (sounds a little weird, but I actually had such an issue a while ago).

Edit:
Talking about PSUs, Thermaltake is overrated IMHO (I had several issues with that brand)
  17:26:52  14 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Message edited by: Derranged, 02/14/2012 17:46:19
Messages: 1009

---QUOTATION---
Yes, you have some of them on by default, but when I go to turn on more, it just fails...

That's what makes me very curious and far from convinced of a hardware issue.

I'd recommend you check out Kingo's Max Shaders; he provides several presets with different options enabled/disabled. Try whether all of the presets work, or which of them fail. Furthermore, his shaders ship with a configuration GUI tool to edit all the options. It might be worth checking whether it makes a difference if you turn the options on with the tool rather than manually in your text editor (sounds a little weird, but I actually had such an issue a while ago).
---END QUOTATION---



Ok, I will have a go with his shaders.

EDIT: OK, his work like yours normally. Once I activate one of the options, it starts to crash. Now, there's an option like this: //World effects

Is that one of the ones I should be taking the // off to get that whole group to work?

EDIT 2: Ok, it died again, so it is not me doing the wrong thing...
  18:29:25  14 February 2012
Meltac
messing with code
(Resident)
On forum: 01/21/2010
Message edited by: Meltac, 02/14/2012 18:31:36
Messages: 1519
//World effects

is not a disabled option but a "real" comment (i.e. intended to be read by humans, not computers). Deleting the slashes (//) will lead to a syntax error, and thus a "pixel shader 1.1" crash. Look for the real options under that line and try those.

EDIT:
Options are those lines that start with #define, so a disabled option looks like this:

//#define THIS_IS_MY_NEAT_OPTION

whereas an enabled option looks like this:

#define THIS_IS_MY_NEAT_OPTION
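
To illustrate why that matters, here is a hypothetical sketch (not a line from Meltac's or Kingo's actual files) of how such a #define acts as a compile-time switch elsewhere in the shader source:

// In ShaderSettings.txt - the user-facing toggle:
#define USE_NEAT_OPTION

// Somewhere in a shader file - HLSL guarded by that option
// (NeatEffect is a hypothetical helper, for illustration only):
#ifdef USE_NEAT_OPTION
    color += NeatEffect(uv);
#endif

Uncommenting a human-readable comment like "World effects", by contrast, injects plain English into the HLSL source, which fails to compile - hence the error at startup, before anything is rendered.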
  00:28:06  15 February 2012
ket
Senior Resident
On forum: 01/13/2006
Message edited by: ket, 02/15/2012 0:28:32
Messages: 1432

---QUOTATION---
How isn't it the brightest idea? The Tt PSU is specially designed to run only the graphics card...

http://www.thermaltakeusa.com/Product.aspx?S=1207&ID=1500

Look at its specs.

EDIT: Theirs is different from mine for some reason... The one I have isn't a main PSU. It comes with a cable which plugs into your main PSU, and then you plug your graphics card cables into it. The stats from their website should be the same.
---END QUOTATION---



The problem is that the TT PSU draws its power from your main PSU. Basically, this means your TT PSU relies on your main PSU to feed it power, as it doesn't have any independent means of getting its own, putting your main PSU under much more stress than it needs to be. In even simpler terms: a 520W main PSU is trying to power a "secondary" 650W PSU that has no independent means of getting its power. Logic alone says you shouldn't use a PSU that's weaker than your secondary PSU to power your secondary PSU. Suffice to say, your main PSU is being put under a lot more stress than it needs. You still haven't told me the exact specification of your system, so I can't say for sure, but your secondary PSU, as it seems atm, is not needed at all. Your Corsair 520W PSU has enough juice to run your GPU as it is.
  04:23:25  15 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Message edited by: Derranged, 02/15/2012 4:48:48
Messages: 1009

---QUOTATION---
The problem is that the TT PSU draws its power from your main PSU. Basically, this means your TT PSU relies on your main PSU to feed it power, as it doesn't have any independent means of getting its own, putting your main PSU under much more stress than it needs to be. In even simpler terms: a 520W main PSU is trying to power a "secondary" 650W PSU that has no independent means of getting its power. Logic alone says you shouldn't use a PSU that's weaker than your secondary PSU to power your secondary PSU. Suffice to say, your main PSU is being put under a lot more stress than it needs. You still haven't told me the exact specification of your system, so I can't say for sure, but your secondary PSU, as it seems atm, is not needed at all. Your Corsair 520W PSU has enough juice to run your GPU as it is.
---END QUOTATION---



And that's where you go wrong; it doesn't draw power from the main PSU... It plugs into your main PSU's graphics cables to take over the graphics card. I have two power leads going to the power sockets in the wall. Dude, it's built for this sort of thing, but mine doesn't have a power cable which you plug into the front; it is hard-wired into the PSU... I should try and find it on the web... A video, I guess...

EDIT: I found the exact one: http://www.thermaltakeusa.com/Product.aspx?C=1265&ID=1544


---QUOTATION---
//World effects

is not a disabled option but a "real" comment (i.e. intended to be read by humans, not computers). Deleting the slashes (//) will lead to a syntax error, and thus a "pixel shader 1.1" crash. Look for the real options under that line and try those.

EDIT:
Options are those lines that start with #define, so a disabled option looks like this:

//#define THIS_IS_MY_NEAT_OPTION

whereas an enabled option looks like this:

#define THIS_IS_MY_NEAT_OPTION
---END QUOTATION---



Ok, but I edited that post and said that it crashed even with the normal options taken off...
  12:02:46  15 February 2012
Meltac
messing with code
(Resident)
On forum: 01/21/2010
Messages: 1519

---QUOTATION---
Ok, but I edited that post and said that it crashed even with the normal options taken off...
---END QUOTATION---



Derranged, could you please send me or upload the file that you have edited that makes the game crash?
  13:55:23  15 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Messages: 1009
I don't think that will help you at all... I was using Kingo's program to set some of the settings; I wasn't setting them manually. If you really want them, you may as well just download his, because they are the same.
  15:00:13  15 February 2012
Meltac
messing with code
(Resident)
On forum: 01/21/2010
Messages: 1519
Ok then, did you try all of Kingo's presets? Did all of them crash, or did some work (and which)?

And are you always getting exactly the same error message? And always no log at all (I'm talking about the X-Ray log file that is written to where the screenshots and savegames are stored on your machine)?

Sorry for all those questions, but I'm just trying to figure out what in the world might cause such weird behaviour.
  17:31:53  15 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Message edited by: Derranged, 02/15/2012 17:32:17
Messages: 1009
If you download it and play, it works. When you edit them, even through his editor, it crashes.

It is always the Pixel Shader v1.1 error... Nothing else.
  18:54:20  15 February 2012
Meltac
messing with code
(Resident)
On forum: 01/21/2010
Messages: 1519

---QUOTATION---
If you download it and play, it works. When you edit them, even through his editor, it crashes.

It is always the Pixel Shader v1.1 error... Nothing else.
---END QUOTATION---



Hmm, seems to me as if the file somehow gets corrupted when you save it. Extremely weird... Apart from the default recommendation to re-install everything on your machine, including Windows itself, I have no clue...
  20:41:16  15 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432

---QUOTATION---
And that's where you go wrong; it doesn't draw power from the main PSU... It plugs into your main PSU's graphics cables to take over the graphics card. I have two power leads going to the power sockets in the wall. Dude, it's built for this sort of thing, but mine doesn't have a power cable which you plug into the front; it is hard-wired into the PSU... I should try and find it on the web... A video, I guess...

EDIT: I found the exact one: http://www.thermaltakeusa.com/Product.aspx?C=1265&ID=1544
---END QUOTATION---



You still haven't given me the full spec of your system, so it's extremely difficult to narrow things down. Even without knowing the proper specs, I can tell you now: just because something is "built for this sort of thing" does not mean you need it, and you can put additional strain on a system. It looks like the TT unit draws its power from the 12V rail of your Corsair unit, putting your Corsair unit under more strain. Simply: Corsair unit > TT unit drawing power from Corsair unit > TT unit passes power to GPU. The Corsair unit is still powering everything. As all I really know about your system is that it's "new", has a 6870 GPU and a Corsair 520W PSU, I can only assume the system isn't stuffed to the gills with additional add-in boards, optical drives, HDDs and the like. In which case, your 520W Corsair unit is perfectly sufficient for your GPU and the rest of the system. The TT unit is not required. It is worth removing the TT unit entirely and seeing if your system has any stability issues.
  06:07:49  16 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Messages: 1009

---QUOTATION---
You still haven't given me the full spec of your system, so it's extremely difficult to narrow things down. Even without knowing the proper specs, I can tell you now: just because something is "built for this sort of thing" does not mean you need it, and you can put additional strain on a system. It looks like the TT unit draws its power from the 12V rail of your Corsair unit, putting your Corsair unit under more strain. Simply: Corsair unit > TT unit drawing power from Corsair unit > TT unit passes power to GPU. The Corsair unit is still powering everything. As all I really know about your system is that it's "new", has a 6870 GPU and a Corsair 520W PSU, I can only assume the system isn't stuffed to the gills with additional add-in boards, optical drives, HDDs and the like. In which case, your 520W Corsair unit is perfectly sufficient for your GPU and the rest of the system. The TT unit is not required. It is worth removing the TT unit entirely and seeing if your system has any stability issues.
---END QUOTATION---



Omg, did you read my other post? THE TT PSU DOESN'T DRAW POWER FROM THE MAIN PSU!

Do you want my full specs?

Mobo: Gigabyte 990FXA-D3
Graphics: Gigabyte Radeon 6870 Super O/C
CPU: AMD Phenom 1055T
Memory: 16 GB Corsair Vengeance 1600MHz (coming today)
PSU/s: Corsair HX520W modular (main), Tt 650W graphics power supply
  10:20:57  16 February 2012
Meltac
messing with code
(Resident)
On forum: 01/21/2010
Messages: 1519
Besides all the hardware discussion you're doing here - I'm fairly convinced the problem you're having with modifying shader config files is not caused by any mainboard, PSU, CPU, RAM, or whatever issue.
  13:18:54  16 February 2012
Derranged
Senior Resident
On forum: 04/12/2010
Messages: 1009
Then why, when I had my old 5770, was I not getting this problem?
  14:27:55  16 February 2012
ket
Senior Resident
On forum: 01/13/2006
Messages: 1432

---QUOTATION---


Omg, did you read my other post? THE TT PSU DOESN'T DRAW POWER FROM THE MAIN PSU!

Do you want my full specs?

Mobo: Gigabyte 990FXA-D3
Graphics: Gigabyte Radeon 6870 Super O/C
CPU: AMD Phenom 1055T
Memory: 16 GB Corsair Vengeance 1600MHz (coming today)
PSU/s: Corsair HX520W modular (main), Tt 650W graphics power supply
---END QUOTATION---



I'm at work when I make these posts I don't have time for detailed looks at things it would of been much faster to just say its a auxillery PSU (which in themselves have proven to be suspect to reliability questions and quality issues) One example of which is here; http://www.hardocp.com/article/2011/01/24/epower_juice_box_450w_auxiliary_power_supply_review/9. Now thats cleared up, back to the point at hand. Your Corsair unit should be perfectly capable of running your system by itself, theres nothing overly taxing in the system. Heres a comprehensive breakdown for the Corsair unit you have; http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=18

That Corsair memory will run at its stock rated timings and speed and nothing more if it's rated CL9 or worse. So if you were thinking about trying to ramp it up to 1866MHz, I'd forget it, as the chances of it managing that are slim to none. I reviewed an 8GB kit of CL9 Vengeance memory last year; to say it was an epic fail would be a colossal understatement. Right now G.Skill are top dogs for memory, no matter what tier you are looking at.

So, in summary, it's still worth seeing how your 6870 runs without the TT unit. I also still agree with Meltac: the shaders work fine, there's nothing wrong with them. For anyone curious about just what exactly it is I do (yes, I'm looking at you, Meltac, to satisfy your curiosity): I'm a PC hardware reviewer, I've previously done some R&D work for Mushkin memory and more recently for ASRock on their Z68 and P67 series of mainboards, and I'm also an IT engineer with MCSA and MCSE qualifications among a plethora of other certificates. So there you go, Meltac, now you know
  14:47:44  16 February 2012
profilee-mailreply Message URLTo the Top
Derranged
Senior Resident
 

 
On forum: 04/12/2010
Messages: 1009
Well, I don't think it is the TT PSU. My system runs absolutely perfectly in every other game and gives a nice frame rate. I wouldn't try running the system on just the 520w unit, because this graphics card demands a lot of power.

ALSO, I was running the TT PSU with my old 5770 and it was able to run these shaders.
  11:30:19  17 February 2012
profilee-mailreply Message URLTo the Top
Meltac
messing with code
(Resident)

 

 
On forum: 01/21/2010
 

Message edited by:
Meltac
02/17/2012 11:36:56
Messages: 1519

---QUOTATION---
Then why, when I had my old 5770, was I not getting this problem?
---END QUOTATION---



Well I didn't say it's not your GPU.

Try this, maybe it helps (although it's for CS):

https://www.gsc-game.com/index.php?t=community&s=forums&s_game_type=xr2&thm_page=5&thm_id=16790&sec_id=19

EDIT:
Also, have you tried uninstalling all GPU drivers and installing some older driver version? I had several issues with my last two graphics cards, and in both cases it turned out that the latest drivers were not the best (at least not for the games I needed them to work with); trying out a few (not just one) older drivers solved my issues. Weird but true
  11:57:31  17 February 2012
profilee-mailreply Message URLTo the Top
Derranged
Senior Resident
 

 
On forum: 04/12/2010
Messages: 1009

---QUOTATION---
Then why, when I had my old 5770, was I not getting this problem?

Well I didn't say it's not your GPU.

Try this, maybe it helps (although it's for CS):

https://www.gsc-game.com/index.php?t=community&s=forums&s_game_type=xr2&thm_page=5&thm_id=16790&sec_id=19

EDIT:
Also, have you tried uninstalling all GPU drivers and installing some older driver version? I had several issues with my last two graphics cards, and in both cases it turned out that the latest drivers were not the best (at least not for the games I needed them to work with); trying out a few (not just one) older drivers solved my issues. Weird but true
---END QUOTATION---



You know what, it actually was the SSAO... except I had to turn it off in the shaders file and not in-game. Thanks!
  12:04:52  17 February 2012
profilee-mailreply Message URLTo the Top
Meltac
messing with code
(Resident)

 

 
On forum: 01/21/2010
 

Message edited by:
Meltac
02/17/2012 12:05:16
Messages: 1519

---QUOTATION---
You know what, it actually was the SSAO... except I had to turn it off in the shaders file and not in-game. Thanks!
---END QUOTATION---



Yes, that's what I forgot to mention: in CS you switch SSAO in user.ltx, but in ShoC you do it in ShaderSettings.txt.
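
A concrete sketch of both toggles, for orientation only; the exact off-token in user.ltx and the define name in ShaderSettings.txt vary by game patch and shader pack, so treat both as assumptions:

; Clear Sky - user.ltx
r2_ssao st_opt_off

// ShoC shader pack - ShaderSettings.txt
//#define USE_SSAO // slashes in front = SSAO disabled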

Glad you finally got it working
  12:14:05  17 February 2012
profilee-mailreply Message URLTo the Top
Derranged
Senior Resident
 

 
On forum: 04/12/2010
Messages: 1009
Haha, thanks. Now I have to tweak it a fair bit to get as much out of it as I can.
  15:03:43  17 February 2012
profilee-mailreply Message URLTo the Top
ket
Senior Resident
 

 
On forum: 01/13/2006
Messages: 1432

---QUOTATION---
Then why, when I had my old 5770, was I not getting this problem?

Well I didn't say it's not your GPU.

Try this, maybe it helps (although it's for CS):

https://www.gsc-game.com/index.php?t=community&s=forums&s_game_type=xr2&thm_page=5&thm_id=16790&sec_id=19

EDIT:
Also, have you tried uninstalling all GPU drivers and installing some older driver version? I had several issues with my last two graphics cards, and in both cases it turned out that the latest drivers were not the best (at least not for the games I needed them to work with); trying out a few (not just one) older drivers solved my issues. Weird but true

You know what, it actually was the SSAO... except I had to turn it off in the shaders file and not in-game. Thanks!
---END QUOTATION---



SSAO is buggy as hell in SoC because it's an old build from when CS was still in development. Your 6870 only requires about 150w of power, which is drawn from the 12v rail; your Corsair unit has 54A on that rail, which is apparently dragged down to 40A, probably because of the split-rail configuration or something similar, if it is dragged down at all. The unit should have plenty of oomph for your system with room to spare. Even according to Thermaltake's system calculator you only need a PSU of around 430w.
http://www.thermaltake.outervision.com/Power
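
A quick sanity check on those figures: 54A on the 12v rail works out to 54 x 12 = 648w of 12v capacity, and even the de-rated 40A figure still gives 40 x 12 = 480w, roughly three times the ~150w the 6870 draws. On those numbers the auxiliary unit really is redundant.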
  15:50:25  17 February 2012
profilee-mailreply Message URLTo the Top
Derranged
Senior Resident
 

 
On forum: 04/12/2010
Messages: 1009
Well, when I went to the store and saw that PSU at that price I grabbed it, not knowing what it was. Then I got home to put it in the case and noticed it was a secondary PSU, but whatever; my system runs fine as it is.
  16:42:21  17 February 2012
profilee-mailreply Message URLTo the Top
Meltac
messing with code
(Resident)

 

 
On forum: 01/21/2010
Messages: 1519

---QUOTATION---
SSAO is buggy as hell in SoC because it's an old build from when CS was still in development.
---END QUOTATION---



That's a well-known one. However, I still wonder why and how it makes some GPUs refuse to work at all. I would have expected some glitches or rendering misbehaviour at runtime, but not that a merely bad shader implementation could make some GPUs give up entirely while others work like a charm
  19:05:34  17 February 2012
profilee-mailreply Message URLTo the Top
ket
Senior Resident
 

 
On forum: 01/13/2006
Messages: 1432
I don't think it's anything to do with SSAO specifically, just the way certain GPU architectures are. Even that explanation still raises some questions, though, especially as I've run SSAO on a lot of different Nvidia and ATI cards without any issues.
  06:47:41  18 February 2012
profilee-mailreply Message URLTo the Top
Derranged
Senior Resident
 

 
On forum: 04/12/2010
 

Message edited by:
Derranged
02/18/2012 12:04:26
Messages: 1009
Awesome, I tweaked the shaders just a little and so far I've seen a massive difference!

Old: http://dl.dropbox.com/u/26574229/Images/Photos/XR_3DA%202012-02-17%2014-32-08-05.png

New: http://dl.dropbox.com/u/26574229/Images/Photos/XR_3DA%202012-02-18%2014-42-15-53.png

I know I should've gotten a picture of the old shaders in that area, but oh well... trust me when I say it is better.

Thanks for the help, guys.

EDIT: Out of curiosity I disabled this line: #define SSAO_TEX_CONT

Then I turned the main SSAO back on and it worked, but it looked the same... maybe that line was the problem all along. I did notice that after disabling it the game became "less dark", as stated in the comment next to it. Very strange...
  23:36:25  19 February 2012
profilee-mailreply Message URLTo the Top
Meltac
messing with code
(Resident)

 

 
On forum: 01/21/2010
Messages: 1519
No problem, dude

It would be interesting, though, to know what that SSAO_TEX_CONT thing does and why it can make the game crash on some GPUs...
  04:11:01  20 February 2012
profilee-mailreply Message URLTo the Top
Lyoko774
Senior Resident
 

 
On forum: 04/09/2007
Messages: 170

---QUOTATION---
No problem, dude

It would be interesting, though, to know what that SSAO_TEX_CONT thing does and why it can make the game crash on some GPUs...
---END QUOTATION---


It artificially increases texture contrast, but it's a bit odd that it seems to kill certain GPUs.
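
For anyone still curious, here is a minimal sketch of the kind of thing a define like that might gate inside the SSAO shader. This is a guess at the mechanism from the observed behaviour, not the actual code from the shader pack:

#ifdef SSAO_TEX_CONT
    // Raise the sampled albedo to a power above 1.0 to deepen dark areas;
    // this matches the observation that disabling the define makes the
    // game "less dark".
    float3 albedo = pow(tex2D(s_base, tc).rgb, 1.3);
#else
    float3 albedo = tex2D(s_base, tc).rgb;
#endif

(s_base and tc are stand-ins for whatever sampler and texture coordinate the shader actually uses.)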
  22:09:46  2 November 2013
profilee-mailreply Message URLTo the Top
Don Reba
Bishop and Councilor of War
(Moderator)

 

 
On forum: 12/04/2002
Messages: 11733
Too bad all the pictures in this thread have since been taken down.
  00:20:36  4 November 2013
profilee-mailreply Message URLTo the Top
Storm Shadow
A machine, a Shadow Machine.
(Resident)

 

 
On forum: 11/14/2007
Messages: 1430
@Ket, would you be so kind as to give a description, in your own words, of what each of the r2_gi_xxx cvars does?

Also, would you kindly post your GI settings that you've now settled on?

Regards,
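
For anyone finding this thread later: to the best of my knowledge the GI-related console variables in vanilla SoC's r2 renderer are the ones below. The one-line descriptions are informal readings of their behaviour, not official documentation:

r2_gi          on/off  - master toggle for global illumination
r2_gi_clip     float   - clip threshold below which secondary (bounce) lights are discarded
r2_gi_depth    integer - number of light bounces traced
r2_gi_photons  integer - photon count per light (the figure discussed at the top of this thread)
r2_gi_refl     float   - reflectance scale applied to bounced light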
 