Very interesting.
I believed autovideosink was the endpoint of a pipeline and that you could link it to a surface or widget (like a QVideoWidget) so that it would render the media there.

Now, as suggested, I created an appsink, and I tried to attach it to the QVideoWidget with:

video_widget->setAttribute(Qt::WA_NativeWindow, true);   // force a native window so winId() is valid
WId win_id = video_widget->winId();                      // native window handle to hand to GStreamer
QApplication::sync();                                    // flush pending window-system requests first
gst_x_overlay_set_window_handle(GST_X_OVERLAY(data->appsink), win_id);

but the video still comes up in a separate window.
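
Or could the problem be that appsink doesn't implement the GstXOverlay interface in the first place? With an overlay-capable sink such as xvimagesink, I assume the call would look something like this (just a guess on my part, against GStreamer 0.10; I believe the handle is normally set in response to the "prepare-xwindow-id" bus message):

GstElement *video_sink = gst_element_factory_make("xvimagesink", "video_sink");  // implements GstXOverlay
video_widget->setAttribute(Qt::WA_NativeWindow, true);
QApplication::sync();
gst_x_overlay_set_window_handle(GST_X_OVERLAY(video_sink), video_widget->winId());
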
Can you explain in more detail how to render the video from an appsink into a QVideoWidget, especially the paintEvent part?

Before the appsink, the pipeline contains videoscale and ffmpegcolorspace.
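
To make the question concrete, here is the kind of thing I imagine for the appsink side (only a sketch: VideoWidget, configure_appsink and on_new_buffer are names I made up, I am assuming GStreamer 0.10 since I use ffmpegcolorspace, and I subclass a plain QWidget because I would be doing the painting myself):

#include <QImage>
#include <QPainter>
#include <QWidget>
#include <gst/gst.h>
#include <gst/app/gstappsink.h>

// Made-up widget that keeps the latest frame and paints it in paintEvent().
class VideoWidget : public QWidget
{
public:
    void setFrame(const QImage &frame)
    {
        current_frame = frame.copy();   // deep copy: the GstBuffer is unreffed right after
        update();                       // schedule a repaint
    }

protected:
    void paintEvent(QPaintEvent *)
    {
        QPainter painter(this);
        if (!current_frame.isNull())
            painter.drawImage(rect(), current_frame);   // scaled to the widget size
    }

private:
    QImage current_frame;
};

// Force RGB caps on the appsink so a buffer maps straight onto a QImage;
// the ffmpegcolorspace upstream should take care of the conversion.
static void configure_appsink(GstElement *appsink)
{
    GstCaps *caps = gst_caps_new_simple("video/x-raw-rgb",
                                        "bpp", G_TYPE_INT, 32,
                                        "depth", G_TYPE_INT, 24,
                                        NULL);
    gst_app_sink_set_caps(GST_APP_SINK(appsink), caps);
    gst_caps_unref(caps);
}

// "new-buffer" callback (needs emit-signals=TRUE on the appsink).
// NB: this runs in a GStreamer streaming thread, so real code would have to
// hand the frame to the GUI thread, e.g. through a queued signal.
static GstFlowReturn on_new_buffer(GstAppSink *sink, gpointer user_data)
{
    VideoWidget *widget = static_cast<VideoWidget *>(user_data);
    GstBuffer *buffer = gst_app_sink_pull_buffer(sink);
    if (!buffer)
        return GST_FLOW_UNEXPECTED;     // EOS

    GstStructure *s = gst_caps_get_structure(GST_BUFFER_CAPS(buffer), 0);
    gint width = 0, height = 0;
    gst_structure_get_int(s, "width", &width);
    gst_structure_get_int(s, "height", &height);

    // Format_RGB32 should match 32 bpp / 24 depth on little-endian;
    // the red/blue masks may need checking on other machines.
    QImage frame(GST_BUFFER_DATA(buffer), width, height, QImage::Format_RGB32);
    widget->setFrame(frame);            // copies the pixels before the buffer is released
    gst_buffer_unref(buffer);
    return GST_FLOW_OK;
}

// Wiring, somewhere after the pipeline is built:
//   configure_appsink(data->appsink);
//   g_object_set(data->appsink, "emit-signals", TRUE, NULL);
//   g_signal_connect(data->appsink, "new-buffer", G_CALLBACK(on_new_buffer), video_widget);

Is that the right direction, or am I missing something?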

Thank you for your answer, though.
Marko