Trying Out GPU Transcoding
Hi, I wanted to know if there is any way to try the GPU transcoding feature early. I know it's still in development, but I have about 10,000 videos I want to import into Immich, and CPU transcoding them would take way too long. I would really appreciate any help anyone can give me.
77 Replies
You could try to use the dev image for your microservices container
ghcr.io/immich-app/immich-server:pr-3171
, but absolutely zero guarantee - it could horribly malfunction and garble all your videos or something like that, so at your own peril
so I just do a docker pull command?
how do I set that up? will it automatically use my nvidia gpu?
I think so
But it's probably better to wait until this is released with actual documentation and such ;)
any eta?
nope
ah ok, I may have to try it since I'm mostly importing my videos and not relying on it as my only storage yet. I can at least let you know how it goes.
It's a bit involved to try on your own Immich instance. As I remember, you need to (1) modify docker-compose.yml (see the PR), and since hardware acceleration is disabled by default, you will (2) also need to pull the web image to change that setting
isn't it like 2 lines in the docker compose file?

Not sure, but I believe also a couple of lines in the devices section
of the docker compose file?
Yep
@brighteyed @bo0tzz In the docker compose file I'm assuming I replace the immich-microservices line
with immich-server:pr-3171 or do I replace this part?
Both
immich-server
and immich-microservices
containers use immich-server
image
so I replace both of the images with immich-server:pr-3171?
@brighteyed @bo0tzz
I would suggest setting up a separate Immich instance to try this experimental feature
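For reference, a rough sketch of what the image override could look like in docker-compose.yml, assuming the standard two services named immich-server and immich-microservices (your service names and the rest of each service block may differ):

    immich-server:
      # same image for both services, just pointed at the pr-3171 tag
      image: ghcr.io/immich-app/immich-server:pr-3171
      ...

    immich-microservices:
      image: ghcr.io/immich-app/immich-server:pr-3171
      ...

Running docker compose pull followed by docker compose up -d then fetches the pr-3171 tag instead of the release image.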
it does not look like the gpu is in use, this is the output of nvidia-smi
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.54.03              Driver Version: 535.54.03    CUDA Version: 12.2     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA GeForce GTX 1050 Ti     Off | 00000000:03:00.0 Off |                  N/A |
| 28%   34C    P0              N/A /  75W |      0MiB /  4096MiB |      0%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+

+---------------------------------------------------------------------------------------+
| Processes:                                                                             |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=========================================================================================|
|  No running processes found                                                            |
+---------------------------------------------------------------------------------------+
here are my docker-compose.yml, hwaccel.yml and .env files
here is my .env file as a txt file so it does not need to be downloaded
only thing I changed in the .env is the upload location
It also needs to be enabled in settings on Immich
It is
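It's also worth checking whether the GPU is visible from inside the container at all, not just on the host. Something along these lines (container name taken from this compose setup) should print the same card:

    docker exec -it immich_microservices nvidia-smi

If that errors out or lists no devices, the container isn't getting GPU access and the encoder will never be used.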




thank you for testing this. the pr has been mostly done but there's so many hardware configurations it's hard to know if it will work for everyone
can you share your
hwaccel.yml
as well as the microservices logs?
oh, i see you posted it and it looks good. since you have nvenc enabled in settings, i'm guessing the hwaccel is failing and it's falling back to software
do you have nvidia container runtime installed on this server?
no I did not, let me see if that fixes the issue
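In case it helps anyone later, installing the runtime on Ubuntu is roughly the following; this is only a sketch, so follow NVIDIA's Container Toolkit docs for the exact repo setup on your distro:

    # install the toolkit, register it with Docker, then restart the daemon
    sudo apt-get install -y nvidia-container-toolkit
    sudo nvidia-ctk runtime configure --runtime=docker
    sudo systemctl restart docker

After that the compose device reservations can actually hand the GPU to the container.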
@sogan still no luck but I see this in the logs for the microservices container
and I see that none of the videos are actually being transcoded
Ah, I think this is a bug. Can you add this to your
hwaccel.yml
and redeploy your microservices container?
sure one sec
THAT WORKED!!!
Awesome!

TYSM!
Glad I could help!
Also if you have an Intel CPU, would you be able to try Quick Sync as well?
You don't need to change the
hwaccel.yml
I don't have a linux setup but I have a windows pc, would it work with that?
If not I may be able to install linux on it on sunday
Ah, I think quick sync doesn't support WSL2 unfortunately so it would need to be on linux
lmk if you do end up trying it since it's the only backend I can't test atm
ah ok I'll gladly try testing when I have a chance
I also saw that VP9 is not supported on NVENC, doesn't nvidia support VP9 encode and decode?
nope nvm sorry, just saw that only decoding is supported, not encoding
yeah I thought it did but unfortunately not. they basically skipped over vp9 since the latest ones support av1.
yeah, shame but I guess expected
I can also pretty much confirm that most amd gpus and APUs do not support VP9 encode
@sogan do you know what command I should use to check if Intel quicksync is being used?
you can check the microservices logs. if the
-vcodec
part says h264_qsv
or hevc_qsv
and there's no errors, it's using qsv
@sogan tried it and it works well, nice work!
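For anyone else wanting to check: one quick way from the host is to grep the microservices logs for the encoder name (container name assumed from this setup):

    docker logs immich_microservices 2>&1 | grep -E 'h264_qsv|hevc_qsv'

Matching lines with no errors around them mean the QSV encoder is actually being used.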
One concern: is all this hardware acceleration available for ARM devices too?
No arm support for hwaccel
not for arm itself, but if the device has a gpu then it should work
e.g. jetson
Eg compute module 4 with an external GPU.
@sogan do you also want me to test?
if you can it'd be great. i made a few changes so qsv should work
@sogan I get this error:
do you have this uncommented in the
hwaccel.yml
?
yes I do
hmm, this is tricky. there are two possibilities i'm thinking here:
1. the host system is missing a dependency for something, possibly libmfx or mesa? these are installed in the container but maybe there's something that needs to be installed in the host as well?
2. the qsv device is actually not renderD128 for this system but maybe something else?
could you try the same settings with vaapi? it will internally use quick sync if it's available. also share:
1. the cpu model and whether this server has a gpu.
2. the output of
ls /dev/dri
on this server
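On most single-GPU systems that listing looks something like the below; if the render node has a different number, the device mapped in hwaccel.yml would need to point at that one instead:

    $ ls /dev/dri
    card0  renderD128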
btw i managed to get the same error on another machine i tested. the pr has been updated to fix this issue so it should work now.
I was thinking of trying out hw transcoding, but my system has an i3-4010U. As per the specs it supports Intel Quick Sync, but as per the documentation a minimum of 7th gen Intel is needed. So I just wanted to confirm if there is any way I can enable it?
You can enable it and see if it works. Let me know how it goes if you do
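One generic way to see what that iGPU actually exposes before flipping the setting is vainfo on the host (from the libva-utils package); this is just a general VA-API check, not something Immich itself requires:

    sudo apt-get install -y libva-utils
    vainfo

If the output lists H.264 encode entrypoints (e.g. VAEntrypointEncSlice under the VAProfileH264 profiles), VAAPI/QSV encoding at least has a chance of working on that 4th-gen chip.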
@sogan I'm sorry for the delay, I'm now on the latest version and when I try to transcode this happens:
this is my hwaccel.yml file:
this is my docker compose file:
this is my .env file
can you share the host os and version (e.g. Ubuntu 23.04)?
yes it's ubuntu 23.04 server
lol good guess
try changing this line
capabilities: [gpu]
to capabilities: [gpu,video]
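If it helps, that line sits in the usual compose GPU reservation block, so with the change it would look roughly like this (a sketch assuming the standard deploy/resources layout for NVIDIA devices):

    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              # 'video' exposes NVENC/NVDEC to the container, not just compute
              capabilities: [gpu, video]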
hmm didn't seem to fix it @sogan
^ from the immich_microservices container ^
that did seem to fix the issue you had before, but this error is really weird. it normally means the video is corrupt. does software transcoding work?
let me try
@sogan
this is without the gpu
yeah something's up with that video. if you can download
78b41b5a-aeef-4026-b17d-97901e586f36.mp4
to your pc, try playing it. if it plays, this is a bug with ffmpeg/ffprobe. if it doesn't, the video is corrupt.
ok but isn't it happening with all the videos with the gpu?
i don't know how many of the videos it affects in this library but i think the logs you shared are for one video
but there are several that fail in the last logs you showed so any of those would also work for testing
overall though, this doesn't seem to be a hardware transcoding problem at this point. the error is about it not being able to decode the video, but decoding is always software-based (only encoding is hardware accelerated).
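btw, a quick way to check one of those files without opening a player, assuming ffmpeg/ffprobe is installed on the machine you copy it to:

    ffprobe -v error -i 78b41b5a-aeef-4026-b17d-97901e586f36.mp4

No output means the file parses cleanly and this is more likely an ffmpeg/ffprobe or pipeline bug; printed errors point at a genuinely corrupt file.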
oh ok thanks, I'll try to re-enable the gpu, leave it overnight and see what happens
@sogan still getting issues with gpu enabled, it seems like every video is failing to transcode:



here are my settings
this is the bottom of the error in the immich_microservices container:
i think this is actually a bug with fluent-ffmpeg in node.js
GitHub
mp4 buffer as input fails to convert a video · Issue #932 · fluent-...
what a weird bug