change which gfx card gzdoom uses
Moderator: Graf Zahl
-
- Posts: 5
- Joined: Tue Jan 29, 2013 21:17
change which gfx card gzdoom uses
When I start GZDoom and go to the console, I see this line:
GL_RENDERER: MOBILE INTEL(R) HD GRAPHICS
In my device manager, under Display Adapters, I see this:
AMD Radeon HD 6490M
Mobile Intel(R) HD Graphics
My OS is Windows 7. Is there something in GZDoom's configuration I can use to change which graphics card it uses?
- Tiger
- Developer
- Posts: 863
- Joined: Thu Feb 25, 2010 3:44
- Location: United States
- Contact:
Re: change which gfx card gzdoom uses
Perhaps I'm wrong, but is it possible to disable one of the graphics devices in the BIOS? For example, disabling the Intel GMA or the ATI card? That might be one solution worth trying.
Nicholas 'Tiger' Gautier
- NeuralStunner
- Posts: 253
- Joined: Tue Dec 29, 2009 3:46
- Location: IN SPACE
- Contact:
Re: change which gfx card gzdoom uses
I'm not sure if vid_adapter would affect this. (If it's 0, you could try setting it to 1, and vice versa.)
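Building on the vid_adapter idea: if that CVAR behaves like other ZDoom video settings (an assumption worth checking on the ZDoom wiki), you would set it from the in-game console and then restart GZDoom for it to take effect. The right index value is a guess; try each in turn:

```text
]vid_adapter 1
```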
Dean Koontz wrote:Human beings can always be relied upon to exert, with vigor, their God-given right to be stupid.
Spoiler: System Specs
- Enjay
- Developer
- Posts: 4723
- Joined: Tue Aug 30, 2005 23:19
- Location: Scotland
- Contact:
Re: change which gfx card gzdoom uses
Tiger wrote:Perhaps I could be wrong, but is it possible to disable one of the graphical devices within BIOS? For example, disabling Intel's GMA or ATi? This is possibly one solution that might help.
That would be my suggestion too. Quite often onboard cards are automatically disabled when another card is added; it seems that hasn't happened with this setup. Sometimes they can be disabled via the BIOS settings.
-
- Posts: 5
- Joined: Tue Jan 29, 2013 21:17
Re: change which gfx card gzdoom uses
Hmm, I'll have to try disabling it in the BIOS. I'm having trouble getting into the BIOS on this machine at the moment, but when I figure it out I'll post my findings. FWIW, disabling the onboard card in Device Manager causes GZDoom to crash as soon as it starts, though I don't know whether the other card is even being seen when I do that.
-
- Posts: 5
- Joined: Tue Jan 29, 2013 21:17
Re: change which gfx card gzdoom uses
Apologies for the double post, but I thought this deserved a bump.
This is apparently an issue with the Samsung Series 7 laptops. They (for some reason) feature both integrated graphics and a dedicated GPU, and use some sort of dynamic switching. OpenGL applications don't trigger the switch to the dedicated card. See http://forum.notebookreview.com/samsung ... -bios.html for more details.
- Gez
- Developer
- Posts: 1399
- Joined: Mon Oct 22, 2007 16:47
Re: change which gfx card gzdoom uses
Sad to see yet another company that treats OpenGL as an inconsequential afterthought.
You might attempt to use one of these things (listed from most recent to most ancient):
http://code.google.com/p/qindie-gl/
http://titaniumgl.tk/
http://sourceforge.net/projects/dxglwrap/
http://www.majorgeeks.com/GLDirect_d381.html
-
- Posts: 5
- Joined: Tue Jan 29, 2013 21:17
Re: change which gfx card gzdoom uses
I installed the fix from the thread I linked earlier, and it actually gives me control over which graphics card an application uses, which is really nice. So if anyone else is having this problem and bought their Samsung Series 7 before roughly April-June 2012, maybe give that a shot.
- Rachael
- Developer
- Posts: 3646
- Joined: Sat May 13, 2006 10:30
Re: change which gfx card gzdoom uses
Gez wrote:Sad to see yet another company that treats OpenGL as an inconsequential afterthought.
You might attempt to use one of these things (listed from most recent to most ancient):
http://code.google.com/p/qindie-gl/
I have tried version 1.0-rev3 of the above-linked driver. It worked really well on my old Intel GPU laptop, apart from a few minor bugs. As I have found, however, 1.0-rev4 does *NOT* work with GZDoom.
It should be noted that to get 1.0-rev3 working in GZDoom, the .dll had to be hex-edited and the reported GL version changed from 1.1 to 1.2. It is a simple string search-and-replace. After that it worked like a charm. When I get home, I will post a fixed .dll for people who have trouble running GZDoom; it may be a solution for some.
For the technically motivated, it is probably better to recompile the .dll with the version changed instead of hex-editing it, since it is open source, but the hex edit is effective if you do not have a compiler installed.
- Gez
- Developer
- Posts: 1399
- Joined: Mon Oct 22, 2007 16:47
Re: change which gfx card gzdoom uses
...Or don't want to bother hunting down dependencies and tweaking project files.
- Rachael
- Developer
- Posts: 3646
- Joined: Sat May 13, 2006 10:30
Re: change which gfx card gzdoom uses
As promised...
The following did not work on the Intel GPU I tested it on:
Screen wipe effects do not work.
Texture mipmapping does not work. You must set the texture filter to either "None" or "Linear"; you cannot use a mipmapped mode.
I am fairly sure shaders do not work, but I have not tested them.
There may be a number of other things that also do not work.
Tested on SwiftShader 2.0: works OK, minus the problems listed above.
Also tested on an NVidia GPU, where nothing on screen would draw except the menu and status bar, so I'd only recommend this if you're having problems already. The same problem occurs on SwiftShader 3.0.
On the ATI GPU I tested it on, the game simply crashed, but I do not know whether that is related to this setup at all. The game crashed even using SwiftShader's Direct3D driver, which worked fine on the other two machines.