Supported hardware question

Advanced OpenGL source port fork from ZDoom, picking up where ZDoomGL left off.

Moderator: Graf Zahl

Enjay
Developer
Posts: 4748
Joined: Tue Aug 30, 2005 23:19
Location: Scotland

Re: Supported hardware question

Post by Enjay »

I'd be surprised if any crash reports from the new renderer were much use at this stage. It is so early in its development, there is still so much to do, and Graf expects it to crash on unsupported hardware anyway.
Remmirath
DRD Team Admin (Inactive)
Posts: 528
Joined: Fri Feb 15, 2008 19:43
Location: Somewhere...

Re: Supported hardware question

Post by Remmirath »

I know, I just wanted to be sure whether I need to replace my video card or not. :P
SoulPriestess wrote:Good job, Morpheus! You are teh awesum!
Graf Zahl
GZDoom Developer
Posts: 7148
Joined: Wed Jul 20, 2005 9:48
Location: Germany

Re: Supported hardware question

Post by Graf Zahl »

Can't say. The code is in too much of a state of flux right now. I'm not trying to make it stable; currently I'm only making sure it works on my system. Please don't bother with crash logs. The most likely cause is that the renderer tries to use a GL API that does not exist in the driver being used.
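
A minimal sketch of that failure mode, assuming the Windows/WGL loader path (illustrative only, not GZDoom's actual code; the typedef mirrors the one in the official glext.h):

    #include <cstdio>
    #include <windows.h>
    #include <GL/gl.h>

    typedef void (APIENTRY *PFNGLBINDBUFFERARBPROC)(GLenum target, GLuint buffer);

    static PFNGLBINDBUFFERARBPROC p_glBindBufferARB;

    // Must be called with a GL context already current.
    bool LoadVBOEntryPoints()
    {
        p_glBindBufferARB =
            (PFNGLBINDBUFFERARBPROC)wglGetProcAddress("glBindBufferARB");

        if (p_glBindBufferARB == NULL)
        {
            // The driver predates the extension; calling through a null
            // pointer here is exactly the kind of crash described above.
            std::printf("VBO entry points missing.\n");
            return false;
        }
        return true;
    }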
Graf Zahl
GZDoom Developer
Posts: 7148
Joined: Wed Jul 20, 2005 9:48
Location: Germany

Re: Supported hardware question

Post by Graf Zahl »

Now here's something I really did *NOT* expect:

Everybody seems to say that VBOs (vertex buffer objects, meaning the vertex data is stored directly in video RAM) are the thing of the future because they're so much faster, yadda, yadda, yadda.

So I set everything up to use VBOs and was now finally able to make a first test of the new renderer with only flats being rendered (using MAP09 of Hellcore while looking down onto the village from the church steeple, which is a great scene for testing rendering speed).

The results:

- Old renderer 52 fps
- New renderer 48 fps

"That can't be," I thought, so I did some tests. First I commented out some shader code that was unneeded for this scene. The result: 49 fps, i.e. not worth the required overhead.

Next I commented out some fields in the vertex structure that weren't needed. Now the FPS rate went up to 52. Better, but still not worth it.

Now I got really curious and converted the code to the old (and supposedly obsolete) immediate mode. The surprising result: 60 fps. Now I have to ask myself: why do I have to jump through hoops and write the most convoluted garbage code to get the data to the GFX card in an 'optimized' way, when the simpler and much more convenient older method works so much better?

Makes no sense to me...
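
For reference, a minimal sketch of the two paths being compared, with an invented five-float vertex layout (illustrative only, not GZDoom's actual code):

    #define GL_GLEXT_PROTOTYPES
    #include <cstddef>
    #include <GL/gl.h>
    #include <GL/glext.h>

    // Invented layout for a flat's vertices.
    struct Vertex { float x, y, z, u, v; };

    // Immediate mode: emit exactly what each polygon needs, call by call.
    void DrawFlatImmediate(const Vertex *v, int count)
    {
        glBegin(GL_TRIANGLE_FAN);
        for (int i = 0; i < count; i++)
        {
            glTexCoord2f(v[i].u, v[i].v);
            glVertex3f(v[i].x, v[i].y, v[i].z);
        }
        glEnd();
    }

    // VBO path: copy the data into a buffer object first, then describe
    // its layout and draw. (On Windows the buffer functions must
    // themselves be loaded via wglGetProcAddress.)
    void DrawFlatVBO(GLuint vbo, const Vertex *v, int count)
    {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, count * sizeof(Vertex), v, GL_STREAM_DRAW);

        glEnableClientState(GL_VERTEX_ARRAY);
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glVertexPointer(3, GL_FLOAT, sizeof(Vertex), (void *)offsetof(Vertex, x));
        glTexCoordPointer(2, GL_FLOAT, sizeof(Vertex), (void *)offsetof(Vertex, u));

        glDrawArrays(GL_TRIANGLE_FAN, 0, count);
    }

The second version is the one that has to build and upload a complete buffer before anything is drawn, which is one place the extra CPU cost described above can come from.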
Nash
Developer
Posts: 1226
Joined: Sun Sep 25, 2005 1:49
Location: Kuala Lumpur, Malaysia

Re: Supported hardware question

Post by Nash »

Heh, it runs for me with r439... but all I get is a black screen. The game still runs, though (I can move around, shoot and kill monsters, but I can only hear the sounds).

I'm not seeing the flats as Enjay describes them...

But hey, I know it's still heavy WIP stuff. The fact that it runs on my system is all I needed to know. ;)

ATI 4870 X2
Enjay
Developer
Posts: 4748
Joined: Tue Aug 30, 2005 23:19
Location: Scotland

Re: Supported hardware question

Post by Enjay »

Nash wrote:I'm not seeing the flats as Enjay describes them...
Just for the hell of it really (r447)...

[Two screenshots of the flats-only renderer output in r447]
Janitor
Persecution Complex
Posts: 123
Joined: Sun Jul 08, 2007 17:16
Location: Not in Kansas anymore

Re: Supported hardware question

Post by Janitor »

I guess I'm going to hop onto this and say "I'm fine with this as long as we see some new features." (/me looks at Legacy's corona effects.) I have no issues upgrading the good ol' GFX card. I have an Nvidia GeForce 8600 GT; I imagine this should be sufficient?
Graf Zahl
GZDoom Developer
Posts: 7148
Joined: Wed Jul 20, 2005 9:48
Location: Germany

Re: Supported hardware question

Post by Graf Zahl »

Of course. I got the same.
Gez
Developer
Posts: 1399
Joined: Mon Oct 22, 2007 16:47

Re: Supported hardware question

Post by Gez »

Graf Zahl wrote:Now I got really curious and converted the code to the old (and supposedly obsolete) immediate mode. The surprising result: 60 fps. Now I have to ask myself: why do I have to jump through hoops and write the most convoluted garbage code to get the data to the GFX card in an 'optimized' way, when the simpler and much more convenient older method works so much better?

Makes no sense to me...
There is the possibility that on future hardware immediate mode will be comparatively slower than VBOs.

Also, I figured you might not be the only one who has stumbled upon this dilemma, so I googled for arguments between IM and VBO, and found this, and this too.
Graf Zahl
GZDoom Developer
Posts: 7148
Joined: Wed Jul 20, 2005 9:48
Location: Germany

Re: Supported hardware question

Post by Graf Zahl »

Well, yeah. I suspect the second case. With Doom, the problem is that VBOs don't map well to the needed functionality, because I have to carry along every field that I *might* need - even if it's only for one polygon in the entire set.

Another issue is that in order to use a streaming VBO I need to make a third pass over the data. The end result is that I lose on the CPU side what I might gain on the GPU side.

The really funny thing is that the last driver I installed seems to have significantly boosted the performance of immediate mode. It was quite amazing how much faster GZDoom ran compared to the previous driver.

After all this I have stopped believing in the rampant pro-VBO propaganda. VBOs are so rigid that a driver may often be in a better position to get the data where it is needed than code that preemptively allocates a large buffer which has to include fields that see only minimal use.
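
To make the "carry all fields along" problem concrete, a hypothetical vertex layout (the field names are invented for illustration, not taken from GZDoom):

    // A VBO forces one fixed vertex layout on the whole buffer, so every
    // vertex pays for every attribute that any single polygon in the set
    // might need.
    struct FlatVertex
    {
        float x, y, z;     // position       - used by every flat
        float u, v;        // texture coords - used by every flat
        float glowheight;  // glow parameter - needed by only a few flats
        float alpha;       // translucency   - needed by only some planes
    };
    // 28 bytes are built and streamed per vertex even though most
    // vertices use only the first 20, and filling that buffer is the
    // extra CPU pass mentioned above.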
Graf Zahl
GZDoom Developer
Posts: 7148
Joined: Wed Jul 20, 2005 9:48
Location: Germany

Re: Supported hardware question

Post by Graf Zahl »

I think I need to make an announcement:

I have put the renderer rewrite on hold indefinitely. My initial plans did not work out, and instead of the hoped-for performance gains the renderer actually got slower!

Before I invest any more work here, I have to do some experiments to see what kinds of optimizations can actually increase the rendering speed. As it looks, the stuff I hoped would gain the most had no effect whatsoever while causing a lot of work - so it's a waste of time to pursue this course of action any further.
Enjay
Developer
Posts: 4748
Joined: Tue Aug 30, 2005 23:19
Location: Scotland

Re: Supported hardware question

Post by Enjay »

Well, that's unfortunate. You have already sunk a good number of hours into it. I suppose it's good that you have realised this now rather than later though.

Do you have any thoughts as to how you might progress as and when you look at it again? Clearly you aren't happy with the renderer as it stands ATM either.
Graf Zahl
GZDoom Developer
Posts: 7148
Joined: Wed Jul 20, 2005 9:48
Location: Germany

Re: Supported hardware question

Post by Graf Zahl »

The one thing I got out of these tests is that the flat rendering mechanism is almost as good as it gets. I tried keeping things precalculated, but the speed difference was so small (less than 0.1 ms on a large level) that it was hardly worth it, and most of that difference was probably due to the lighting calculations, which are still missing but which I need to keep. That's not worth completely redoing the code, so I might as well scrap what I started there. Fortunately this is the only part that will get scrapped; the rest of what I already did works fine.

The code I am least satisfied with in the old renderer is the wall rendering. It wastes a good amount of time setting things up, especially if there are lots of 3D floors. But after my tests I'll approach this differently than I had planned.
Sussudio
Posts: 98
Joined: Tue Jul 14, 2009 21:49

Re: Supported hardware question

Post by Sussudio »

When GZDoom detects the capabilities of a video card, where does it get this info from? The video drivers? By directly testing the hardware?
Gez
Developer
Posts: 1399
Joined: Mon Oct 22, 2007 16:47

Re: Supported hardware question

Post by Gez »

Sussudio wrote:When GZDoom detects the capabilities of a video card, where does it get this info from? The video drivers?
That.

To query the hardware directly, you'd have to have querying routines dedicated to each different piece of existing hardware... Possible, maybe; reasonable, certainly not!

That's why, when you update your drivers, your chipset may become able to do things it couldn't do before. And on the other hand, if your drivers aren't updated... For example, I have a GeForce 9600M GT. The hardware is capable of supporting OpenGL 3.0, PhysX and CUDA. But it's an MSI computer, and for some reason MSI decided to prevent its laptops from using Nvidia/ATI's own drivers (trying to install Nvidia's latest drivers results in an error because the installer cannot identify the chipset), so I'm stuck with their custom version of the 176.88 branch, which is restricted to OpenGL 2.1 with no PhysX and no CUDA. (Nvidia's drivers were in the 186.40 branch last time I checked...)
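
In code terms, "asking the drivers" is just the standard GL query calls. A generic sketch (not GZDoom's actual startup code; it assumes a GL context is already current):

    #include <cstdio>
    #include <cstring>
    #include <GL/gl.h>

    void ReportDriverCaps()
    {
        // All of these strings come from the driver, not the silicon.
        std::printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
        std::printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
        std::printf("Version:  %s\n", (const char *)glGetString(GL_VERSION));

        // Features are advertised as names in one big extension string,
        // which is why a driver update alone can "unlock" abilities the
        // hardware had all along.
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        bool vbo = (ext != NULL &&
                    std::strstr(ext, "GL_ARB_vertex_buffer_object") != NULL);
        std::printf("VBOs: %s\n", vbo ? "supported" : "not supported");
    }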