Brave VAAPI
I know this is a Bazzite sub, but I was wondering if I could get some help with Fedora. I've gotten VAAPI working in Brave. The problem is that the decoding occurs on the dGPU; it seems to be choosing /dev/dri/renderD128. Is there any way to get it to use renderD129, which is the iGPU? Running on a Framework 16.
no idea
you can try
switcherooctl launch -g 1 commandhere --arg1 --arg2 --arg3
to use the 2nd GPU. The -g number controls which GPU the application runs on: 1 will be your 2nd GPU, 0 would be the first GPU. If that works, you can put the same switcherooctl command in the Exec line of the .desktop file to automate using GPU 1 instead of GPU 0, as in the example below.
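For example, to launch Brave on the second GPU (a sketch; it assumes switcheroo-control is installed and that the binary is named brave-browser):

switcherooctl launch -g 1 brave-browser

and, if that works, in the .desktop file something like:

Exec=switcherooctl launch -g 1 brave-browser %U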
No, unfortunately that didn't work. Here are the launch options for Brave that I am using:
MESA_VK_DEVICE_SELECT=1002:15bf! MESA_VK_DEVICE_SELECT_FORCE_DEFAULT_DEVICE=1 brave-browser --enable-features=VaapiVideoDecoder,VaapiIgnoreDriverChecks,Vulkan,DefaultANGLEVulkan,VulkanFromANGLE --ignore-gpu-blocklist --disable-gpu-driver-bug-workarounds
I have a feeling this is happening because I'm using Vulkan as the backend.
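One way to check which render node belongs to which GPU, and what each one advertises (a sketch; device IDs and node numbers vary by machine):

# print the Vulkan devices Mesa's device-select layer can see
MESA_VK_DEVICE_SELECT=list vulkaninfo --summary

# query VAAPI decode support on each render node
vainfo --display drm --device /dev/dri/renderD128
vainfo --display drm --device /dev/dri/renderD129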
don't think you want that ! in the MESA_VK_DEVICE_SELECT=1002:15bf!
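For context (as I understand Mesa's device-select layer): without the suffix the variable only prefers that device, while the trailing ! hides every other Vulkan device from the application.

# prefer the 1002:15bf device, leaving the others visible
MESA_VK_DEVICE_SELECT=1002:15bf brave-browser

# the trailing "!" exposes only the 1002:15bf device to the app
MESA_VK_DEVICE_SELECT=1002:15bf! brave-browser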
Negative. I even tried in a container, since that does seem to limit which GPU is visible. Oh well, I can go back to using Firefox or LibreWolf. Thanks for the help!
So I guess I got it working in the dumbest way possible. It seems Chromium, Brave, etc. are hard-coded to use /dev/dri/renderD128. I got supergfxctl working on the Framework 16. With the dGPU enabled, you can mv /dev/dri/renderD128 to something like renderD127 and then make /dev/dri/renderD128 a symbolic link pointing at the iGPU's node (renderD129).
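Roughly, that workaround looks like this (a sketch; it assumes renderD128 is the dGPU and renderD129 the iGPU, and since udev recreates /dev at boot, it won't persist across reboots):

# move the dGPU's node out of the way
sudo mv /dev/dri/renderD128 /dev/dri/renderD127
# point the hard-coded path at the iGPU's node instead
sudo ln -s /dev/dri/renderD129 /dev/dri/renderD128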