Jan Grulich

How to use libportal/libportal-qt

There was a blog post from Peter Hutterer about Flatpak portals posted a few months back. Peter explained what portals are and how they work. Portals are used mostly because of security and sandbox/Wayland restrictions. Many times your only way to get access outside (opening a file, sending a notification, sharing a screen, etc.) is to use a portal. For most use-cases applications or developers don’t need to care about them, as their support is usually implemented in the libraries they use. For example Qt and GTK use portals internally so apps can still use the same APIs as before and they don’t need to worry about their apps not working in sandboxed environments. BUT there are still scenarios where libraries have insufficient or no portal support, or a different approach is desired, so what are the options if you still need to use portals yourself?

  1. Do everything yourself, which means implementing all the DBus calls and response handling on your own (see the sketch after this list).
  2. Use a library. The most logical choice would be libportal, but there is also a project called ASHPD for Rust users.
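
To give an idea of what option 1 involves, here is a rough sketch of calling a portal directly over D-Bus with Qt, using the OpenURI portal because its call is the simplest. This is only an illustration of the raw approach; the Response signal on the returned Request object and all error handling are left out.

#include <QDBusConnection>
#include <QDBusMessage>
#include <QString>
#include <QVariant>

// Minimal sketch: ask the OpenURI portal to open a link for us. A real
// implementation would also subscribe to the Response signal of the Request
// object returned by this call to learn how the request ended up.
void openUriThroughPortal(const QString &uri)
{
    QDBusMessage message = QDBusMessage::createMethodCall(
        QStringLiteral("org.freedesktop.portal.Desktop"),
        QStringLiteral("/org/freedesktop/portal/desktop"),
        QStringLiteral("org.freedesktop.portal.OpenURI"),
        QStringLiteral("OpenURI"));

    // Arguments: parent window handle, the URI and an options dictionary (a{sv}).
    message << QString() << uri << QVariantMap();

    QDBusConnection::sessionBus().asyncCall(message);
}

Multiply this by every portal and every option you need and you can see why a library is usually the better choice.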

What is libportal and libportal-qt?

The libportal library provides GIO-style async APIs for Flatpak portals. It hides all the DBus complexity users would face when using portals directly and provides a user-friendly library instead. You might think that libportal-qt is the same thing, just with Qt-style APIs, but the idea behind it is that each toolkit (Gtk3, Gtk4, Qt5, Qt6) has a different way to get a window handle, which is needed to associate portal dialogs with the app that invoked them. So libportal-qt just provides a way to get an XdpParent object from a QWindow. As a C++/Qt developer I don’t mind using C/GLib APIs and I have used them many times, but there is still one speciality I fail to use every time: my friend GVariant. Some of the portal APIs in libportal expect a GVariant for all the complex structures; for example, to specify a filter option for the OpenFile() call from the filechooser portal, you have to build a very complex GVariant based on the DBus specification.

Remember I told you libportal-qt doesn’t offer Qt-style APIs? This is not necessarily true, because I implemented all the complex structures you will have to pass to most of the portals, together with functions that return them as GVariants, so you don’t need to get in touch with GVariants at all.

How to use libportal-qt?

First of all, all libportal flavours install a pkgconfig file, so it’s easy to use them from any build system; you just need to search for libportal-qt5 (we don’t have a -qt6 version yet).

And what does the code look like? For example, let’s say you want to open an image:

// Create a filter rule; this can be a Mimetype or a Pattern.
XdpQt::FileChooserFilterRule rule;
rule.type = XdpQt::FileChooserFilterRuleType::Mimetype;
rule.rule = QStringLiteral("image/jpeg");

// Create a filter with our rules; we will then pass it to the OpenFile() call as a GVariant.
XdpQt::FileChooserFilter filter;
filter.label = QStringLiteral("Images");
filter.rules << rule;

// Create a GVariant from our filter. This will result in a variant in the form of:
// "[('Images', [(1, 'image/jpeg')])]"
g_autoptr(GVariant) filterVariant = XdpQt::filechooserFiltersToGVariant({filter});

// Get XdpParent to associate this call (portal dialog) with our window.
XdpParent *parent = xdp_parent_new_qt(m_mainWindow->windowHandle());

// Finally open a file. XdpQt::globalPortalObject() is another convenient function
// that creates a global instance of the XdpPortal object so you don't need to take
// care of creating it yourself. For some of the arguments we just pass a nullptr to
// leave them unspecified.
xdp_portal_open_file(XdpQt::globalPortalObject() /*XdpPortal object*/,
                     parent /*XdpParent object*/, "Title", filterVariant /*filters*/,
                     nullptr /*current_filter*/, nullptr /*choices*/,
                     XDP_OPEN_FILE_FLAG_NONE /*flags*/, nullptr /*cancellable*/,
                     openedFile /*callback*/, this /*data*/);
xdp_parent_free(parent);

// The callback would then look like this, e.g.:
static void openedFile(GObject *object, GAsyncResult *result, gpointer data) {
    g_autoptr(GError) error = nullptr;
    g_autoptr(GVariant) ret = 
        xdp_portal_open_file_finish(XdpQt::globalPortalObject(), result, &error);

    if (ret) {
        // Another convenient function that will get you the uris and choices from
        // the GVariant returned by the xdp_portal_open_file() call.
        XdpQt::FileChooserResult fileResult = XdpQt::filechooserResultFromGVariant(ret);

        // Do whatever you want with the result. Here we just print the selected files.
        qDebug() << fileResult.uris;
    }
}

As you can see, no GVariant got hurt and you can easily open a file without any GVariant knowledge. Besides the FileChooser portal helpers, we also have Notification portal helpers, because serializing icons and buttons is also something that is not trivial. For the rest of the portals you either don’t need to use complex GVariants, so you can use them easily without helper functions the same way as shown above, or, like ScreenCast or RemoteDesktop, they are not used that often and we don’t have helper functions for them just yet.
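
For illustration, using the notification helpers could look roughly like the sketch below. The Qt-side struct and function names (XdpQt::Notification, XdpQt::NotificationButton, XdpQt::notificationToGVariant) are modeled on the filechooser helpers shown above and may differ slightly from the actual libportal-qt API, so check the headers; xdp_portal_add_notification() is the underlying libportal call.

// Sketch only: the XdpQt notification types/helpers below mirror the
// filechooser helpers and are not guaranteed to match the exact API.
XdpQt::NotificationButton button;
button.label = QStringLiteral("Open");
button.action = QStringLiteral("open-action");

XdpQt::Notification notification;
notification.title = QStringLiteral("Download finished");
notification.body = QStringLiteral("Your file has been downloaded.");
notification.buttons << button;

// Serialize the icon/buttons into the GVariant the Notification portal expects.
g_autoptr(GVariant) notificationVariant = XdpQt::notificationToGVariant(notification);

xdp_portal_add_notification(XdpQt::globalPortalObject(), "download-finished" /*id*/,
                            notificationVariant, XDP_NOTIFICATION_FLAG_NONE,
                            nullptr /*cancellable*/, nullptr /*callback*/, nullptr /*data*/);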

I hope you find this helpful in case you want to join this world. The libportal project is hosted on GitHub in case you want to try it right now, because this is not yet part of any stable release (it will be in libportal 0.6), or report a bug, or just look at my GVariant helpers to see what I spare you from.

DMA-BUF support in WebRTC

It has been almost three years since we landed initial support for screensharing on Wayland with the use of PipeWire in the WebRTC project. This enabled screensharing support in both major Linux browsers. Last year I implemented support for window sharing, added support for PipeWire 0.3 and added support for the DMA-BUF and MemFD buffer types. The problem was that, as it turned out, the DMA-BUF support was not implemented in the correct way.

The original implementation used mmap() to get the buffer content. This worked correctly on current Intel GPUs, but was terribly slow on e.g. AMD GPUs. The proper solution is to use an OpenGL context to get the content from the buffer. However, there were already many implementations using mmap(), including WebRTC, and we needed a way to properly communicate between the server and the client that when the client advertises DMA-BUF support, it doesn’t use mmap() and goes through an OpenGL context instead.
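
To give an idea of what “going through an OpenGL context” means in practice, below is a heavily simplified sketch of importing a single-plane DMA-BUF as an EGLImage and reading its pixels back. This is not the actual WebRTC code (which handles multiple planes, modifiers, pixel formats and errors), just the core EGL/GL calls involved.

#include <cstdint>
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

// Simplified sketch: import a single-plane DMA-BUF fd as an EGLImage and
// download its content with glReadPixels. Assumes a current EGL/GLES context
// and the EGL_EXT_image_dma_buf_import and GL_OES_EGL_image extensions.
bool ReadDmaBuf(EGLDisplay display, int fd, uint32_t width, uint32_t height,
                uint32_t drm_fourcc, uint32_t stride, uint32_t offset, uint8_t *out) {
  const EGLAttrib attribs[] = {
      EGL_WIDTH, static_cast<EGLAttrib>(width),
      EGL_HEIGHT, static_cast<EGLAttrib>(height),
      EGL_LINUX_DRM_FOURCC_EXT, static_cast<EGLAttrib>(drm_fourcc),
      EGL_DMA_BUF_PLANE0_FD_EXT, fd,
      EGL_DMA_BUF_PLANE0_OFFSET_EXT, static_cast<EGLAttrib>(offset),
      EGL_DMA_BUF_PLANE0_PITCH_EXT, static_cast<EGLAttrib>(stride),
      EGL_NONE};

  EGLImage image = eglCreateImage(display, EGL_NO_CONTEXT,
                                  EGL_LINUX_DMA_BUF_EXT, nullptr, attribs);
  if (image == EGL_NO_IMAGE)
    return false;

  // Bind the EGLImage to a texture, attach it to a framebuffer and read it back.
  auto glEGLImageTargetTexture2DOES =
      reinterpret_cast<PFNGLEGLIMAGETARGETTEXTURE2DOESPROC>(
          eglGetProcAddress("glEGLImageTargetTexture2DOES"));

  GLuint texture = 0;
  glGenTextures(1, &texture);
  glBindTexture(GL_TEXTURE_2D, texture);
  glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, image);

  GLuint fbo = 0;
  glGenFramebuffers(1, &fbo);
  glBindFramebuffer(GL_FRAMEBUFFER, fbo);
  glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texture, 0);
  glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, out);

  glDeleteFramebuffers(1, &fbo);
  glDeleteTextures(1, &texture);
  eglDestroyImage(display, image);
  return true;
}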

Here are some issues if you want to read about the details:

All of this resulted in a completely different way the communication between the consumer and the producer happens in order to use DMA buffers for way faster and smoother screensharing. Both sides are now required to query the list of all supported modifiers and add it as a new stream parameter, including flags marking the modifiers as a mandatory parameter; the rest of the stream parameters are kept as before so we can keep using the other buffer types in case DMA-BUFs are not supported by the producer. Once both sides match their expectations, we query whether the negotiated stream includes modifiers, and based on that we know we can use DMA buffers, which we now properly open using an OpenGL context, while we kept mmap() for the MemFd buffer types as a fallback. This results in faster screensharing in your web browsers.
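
For the curious, on the consumer side this boils down to building the EnumFormat stream parameter with the video modifier property flagged as mandatory and “don’t fixate”, carrying the list of modifiers the consumer supports. A simplified sketch following the pattern documented by PipeWire (not the exact WebRTC code) can look like this:

#include <cstddef>
#include <cstdint>
#include <spa/param/video/format-utils.h>
#include <spa/pod/builder.h>

// Simplified sketch: build an EnumFormat param that advertises DMA-BUF support
// by adding the modifier property (mandatory + don't fixate) with the list of
// modifiers the consumer supports. Size, framerate etc. are omitted for brevity.
static const struct spa_pod *BuildFormatWithModifiers(
    struct spa_pod_builder *builder, const uint64_t *modifiers, size_t n_modifiers) {
  struct spa_pod_frame object_frame;
  struct spa_pod_frame choice_frame;

  spa_pod_builder_push_object(builder, &object_frame,
                              SPA_TYPE_OBJECT_Format, SPA_PARAM_EnumFormat);
  spa_pod_builder_add(builder,
                      SPA_FORMAT_mediaType, SPA_POD_Id(SPA_MEDIA_TYPE_video),
                      SPA_FORMAT_mediaSubtype, SPA_POD_Id(SPA_MEDIA_SUBTYPE_raw),
                      SPA_FORMAT_VIDEO_format, SPA_POD_Id(SPA_VIDEO_FORMAT_BGRA),
                      0);

  // The modifier property is what tells the producer we can handle DMA-BUFs.
  spa_pod_builder_prop(builder, SPA_FORMAT_VIDEO_modifier,
                       SPA_POD_PROP_FLAG_MANDATORY | SPA_POD_PROP_FLAG_DONT_FIXATE);
  spa_pod_builder_push_choice(builder, &choice_frame, SPA_CHOICE_Enum, 0);
  // The first value is the preferred one, followed by the full list.
  spa_pod_builder_long(builder, static_cast<int64_t>(modifiers[0]));
  for (size_t i = 0; i < n_modifiers; ++i)
    spa_pod_builder_long(builder, static_cast<int64_t>(modifiers[i]));
  spa_pod_builder_pop(builder, &choice_frame);

  return static_cast<const struct spa_pod *>(spa_pod_builder_pop(builder, &object_frame));
}

If the producer doesn’t understand modifiers, the negotiation simply falls back to the other parameters, which is exactly the fallback behavior described above.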

Last but not least, I made screensharing even faster, regardless of the buffer type we use. Originally, when we received a buffer from PipeWire, we copied it to a local variable so we could apply cropping and adjust the position, and only after that did we copy this adjusted content into a DesktopFrame, which each DesktopCapturer (a class representing a screensharing implementation) is supposed to return so it can be displayed by the browser. That means we performed two copy operations for each frame. I improved this implementation and we now copy the PipeWire buffer content directly into a DesktopFrame we can return, so one copy operation fewer than before. I didn’t take exact measurements, but simply running htop and comparing the usage of the top 5 processes when sharing a 4k screen I got:

  • Original result: 66%, 64%, 26%, 23%, 10%
  • Updated result: 41%, 39%, 19%, 17%, 12%
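
To illustrate the single-copy idea, the (already cropped) PipeWire buffer data now goes straight into the DesktopFrame the capturer returns, instead of through an intermediate buffer. The function below is only an illustration of the concept, not the actual WebRTC code.

#include <cstdint>
#include <memory>

#include "modules/desktop_capture/desktop_frame.h"

// Sketch: copy the source pixel data directly into the frame we hand to the
// browser, so the PipeWire buffer is copied exactly once.
std::unique_ptr<webrtc::DesktopFrame> FrameFromPipeWireBuffer(
    const uint8_t *src, int src_stride, const webrtc::DesktopSize &size) {
  auto frame = std::make_unique<webrtc::BasicDesktopFrame>(size);
  frame->CopyPixelsFrom(src, src_stride, webrtc::DesktopRect::MakeSize(size));
  return frame;
}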

I also have some other improvements on my TODO list; all of them should bring additional optimizations. I will keep you informed once I have news to share with you.

Both changes have been merged into WebRTC, which means they should be in Chrome/Chromium 96 (released during November 2021).

HighContrast variants for Adwaita-qt

In the past we used to have a completely different project to cover HighContrast variants of the GTK Adwaita theme. This was implemented as Highcontrast-qt, a project nobody has touched for 6 years. You can imagine what it looks like these days when you compare it to what we have now. I think even the GTK variant of HighContrast was a completely separate theme back then, while these days it’s just Adwaita with a different set of colors.

Since GTK made the new HighContrast theme with just a few modifications to the original Adwaita theme, I decided to use the same approach and have Adwaita-qt provide all four variants as well (Adwaita, Adwaita-dark, HighContrast and HighContrastInverse). While this looks like a simple thing to do, as you just need to add an additional color palette, it was a pain to do in Adwaita-qt. The reason is that Adwaita-qt is full of hardcoded color definitions, all of them randomly taken from the GTK Adwaita stylesheets. Every time something changed in GTK Adwaita, we would have to manually pick up the change and replace the changed color value on our side. This was not really sustainable, especially when I wanted to have four different variants.

To improve this situation and make my life easier, I decided to bundle the GTK stylesheets for the Adwaita theme, have them processed and write a simple parser to make everything automated. And I did exactly that. I included the stylesheets and wrote some definitions myself so I can have them processed with sassc (GTK uses SASS for theming) and have all my definitions substituted for simple parsing. I no longer have to pick all the base colors manually; all of them are parsed for all four variants, and the same goes for the basic styling of Buttons, CheckBoxes and Radio Buttons, where each of them has all kinds of possible states (active, hovered, checked etc.) and uses not only a simple color, but also gradients. You can imagine how hard it was to hardcode all the values for each state. The parser I wrote is really basic and simple, using regular expressions, as the code I’m trying to process is not that complex.

The code I try to process is either a simple definition:

@define-color base_color #ffffff;

Or a widget definition in this form:

button:checked { color: #2e3436; border-color: #cdc7c2; background-image: image(#dad6d2); box-shadow: none; }
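
To give an idea of how little is needed for the simple case, a @define-color line can be extracted with a QRegularExpression along these lines (a sketch of the approach, not the actual Adwaita-qt parser, which also handles the widget rules above):

#include <QColor>
#include <QHash>
#include <QRegularExpression>
#include <QString>

// Sketch: collect "@define-color name #rrggbb;" definitions from a processed
// stylesheet into a name -> color map.
QHash<QString, QColor> parseColorDefinitions(const QString &stylesheet)
{
    QHash<QString, QColor> colors;
    static const QRegularExpression re(
        QStringLiteral("@define-color\\s+(\\w+)\\s+(#[0-9a-fA-F]{6});"));

    auto it = re.globalMatch(stylesheet);
    while (it.hasNext()) {
        const QRegularExpressionMatch match = it.next();
        colors.insert(match.captured(1), QColor(match.captured(2)));
    }
    return colors;
}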

The result is:

Adwaita-qt: Adwaita variant
Adwaita-qt: HighContrast variant

I think this is a big step forward for Adwaita-qt and it will allow me to respond more quickly to changes happening in the GTK Adwaita/HighContrast theme. I can also imagine this being extended in the future to support some additional variants, like the modified Adwaita theme you can find in Ubuntu (at least if the stylesheet is similar enough). As mentioned, Adwaita-qt now supports four variants and they should be on par with GTK 4.4, at least when it comes to colors and style for the most used widgets, because there are still places in Adwaita-qt that need some extra work. Anyway, this is all now released as Adwaita-qt 1.4.0 and I will be updating the Flathub and Fedora builds to it soon.

Refreshed UI for Fedora Media Writer

For those who don’t know, Fedora Media Writer is a tool to create a bootable live USB drive with your favorite flavor of Fedora. It is written in C++, with the UI written in QML, and it is supported on Linux, Windows and Mac OS X. It was developed by Martin Bříza, my former colleague from Red Hat, who did an amazing job in the past. Fedora Media Writer (FMW) primarily targets Fedora Workstation and therefore the UI looks like a GNOME app using the Adwaita theme. Unfortunately the Adwaita theme changed over time, and FMW was originally written using QtQuickControls 1 (deprecated these days), so it needed a UI overhaul.

I started working on FMW during the summer, slowly migrating it to QtQuickControls 2. The original UI had lots of custom QML widgets, basically standard widgets with an Adwaita skin on them. I still wanted FMW to use the Adwaita theme, because Qt doesn’t have any native QML components for Windows, Mac OS X or GNOME, and writing those would require a lot of work. Therefore I decided to write a new QQC2-based Adwaita theme which can be used on all platforms.

To avoid duplicating half of the code we already have in Adwaita-qt (a QStyle to make QWidgets look like Adwaita), like information about widget sizes and colors, I reworked Adwaita-qt to provide a library, so it can be used by projects like this and they don’t need to be updated every time Adwaita changes. It was more work than I anticipated, because it needed quite a lot of changes to separate things into a library and also to make it build and work on all platforms where I want to use FMW. The good news is that the work is now done and I made a pre-release of Adwaita-qt. The library for now provides information about widget sizes, the color palette and the colors used by all widgets, but I plan to extend this in the future with the Adwaita-qt rendering part, allowing the library to render basic widgets for you. That’s something I would like to use for example in QGnomePlatform (the GNOME platform theme) to render buttons in window decorations.

With so much already said about Adwaita-qt, the work on the QQC2 Adwaita theme was an interesting experience and probably the most enjoyable part, because every time you write a new component and port the app to use it, you see the result of your work, and the app slowly migrating towards a more modern UI makes you happy with the result. I don’t know what more to say about the QQC2 Adwaita theme, as it’s basically a QML variant of the widgets we have in Adwaita-qt, with the difference that it should look exactly the same on all platforms thanks to using Adwaita-qt. In the past, with QQC1, all the colors were derived from the system QPalette, making it look slightly different on each platform. If you wonder why the QQC2 theme is not part of Adwaita-qt, where it will most likely end up, it’s because it’s not complete yet and contains only the components used in FMW itself. Anyway, I finished the port to QQC2 this week with some late fixes, and after spending a week updating all the build systems (Windows, GitHub CI, Mac OS X) to properly build and produce builds for you to test, I made a new pre-release yesterday \o/.

The work on this port is most likely not 100% finished, as I expect some minor issues to appear here and there, but I tried to make this a 1:1 copy of the previous version, so don’t expect any major changes. I will be glad if you try it and let me know what you think. Thank you, and an especially big thanks goes to Martin Bříza for his help during the development and for the work he did on this project in the past.

You can get it from the following locations:

Here you have some images for comparison:

Cute Qt applications in Fedora Workstation

Fedora Workstation is all about Gnome and it has been since the beginning, but that doesn’t mean we don’t care about Qt applications; the opposite is true. Many users use Qt applications, even on Gnome, mainly because many KDE/Qt applications don’t have an adequate replacement written in Gtk, or they are just used to them and don’t really have a reason to switch to another one.

For Qt integration, there is some sort of Gnome support in Qt itself, which includes a platform theme reading the Gnome configuration, like fonts and icons. This platform theme also provides native file dialogs, but don’t expect a native look of Qt applications. There used to be a gtk2 style, which used gtk calls directly to render natively looking Qt widgets, but it was moved from qtbase to qt5-styleplugins, because it cannot be used today in combination with gtk3.

For the reasons mentioned above, we have been working on a Qt style to make Qt applications look native in Gnome. This style is named adwaita-qt and from the name you can guess that it makes Qt applications look like Gtk applications with the Adwaita style. Adwaita-qt is actually not a new project; it’s been there for years and it was developed by Martin Bříza. Unfortunately, Martin left Red Hat a long time ago and since then a new version of Gnome’s Adwaita was released, completely changing the colors and making the Adwaita theme look more modern. Being the one who takes care of these things nowadays, I started slowly updating adwaita-qt to make it look like the current Gnome Adwaita theme and voilà, a new version was released after 3 months of intermittent work. You can see the results here:

Isn’t it beautiful? The theme is far from perfect and there will definitely still be some minor issues, but writing a Qt style is far from an easy task, as the QStyle class is quite complex. If you find any issue, you can open a bug and I will try to fix it. You can also send me patches if you decide to fix something yourself (I will be happy about that). The repository is hosted on GitHub under FedoraQt/adwaita-qt.

And of course that was a lie; the screenshots above show the old version of adwaita-qt (for comparison), the new ones are actually here:

I hope you like it more now :).

Tutorial: Screen Sharing and Remote Desktop on Fedora Workstation 30 (Wayland)

I recently got an email from a user asking me how to make all this work on Fedora. The problem is that unlike in old XServer sessions, there are certain things which need to be enabled first. There are also dependencies which need to be installed and services which need to be running. While most of the dependencies are automatically installed and the services automatically activated, there still might be situations when this is not true, for example when switching from another desktop, so it’s better to cover it all. This tutorial targets Fedora, but it can probably be used with any other distribution.

Dependencies

Both screen sharing and remote desktop work almost identically on Wayland; they both use portals as a communication tool between applications and the compositor (in this case Mutter) to start the sharing process and set up a PipeWire stream (see below). While portals were primarily meant to be used by sandboxed applications (e.g. Flatpak) to get access to the system (like files or printing) outside the sandbox, their design fits Wayland usage perfectly too. In Fedora you should have the portals automatically installed. They are represented by two separate packages: the first is xdg-desktop-portal, which is the portal service communicating with sandboxed applications and with a backend implementation of the portals, and the second package is the backend implementation itself, in our case xdg-desktop-portal-gtk. Both are DBus activatable, which means they are automatically started whenever an application calls them. The reason why the portals consist of two services is that there can be multiple backends, each one providing native dialogs for your desktop. For example, so you don’t get a gtk dialog to open a file in a KDE Plasma session, or when you want a backend communicating with a specific compositor (like in our case with Mutter).

The second important dependency is PipeWire. PipeWire is the core technology used for delivering the screen content from the compositor to applications. This is done through a PipeWire stream shared between the compositor and the application. PipeWire should be automatically installed on your system; the package name is pipewire and it provides socket-based activation, so you shouldn’t need to worry about whether it’s running or not.

Enabling screen sharing and remote desktop in Gnome

You don’t seem to need to do any additional steps to make screen sharing work. However, you need to enable remote desktop (if you want it). Open gnome-control-center (Settings) and go to the Sharing section. There you should see this window when you click on Screen Sharing:

If you don’t have such an option, make sure you have gnome-remote-desktop installed, because I’m not sure whether it’s installed by default. Allowing screen sharing will start a server instance which you can connect to from another computer, using vnc://linux.local or your_computer_ip:5900. You will most likely need to open a hole in your firewall, but you need to do the same for any other VNC server. I tried to connect with Vinagre and Krdc (the KDE VNC viewer) and both worked for me, but I was unable to connect with Tigervnc (vncviewer) due to a non-matching security type.

Screen sharing in Firefox

Firefox in Fedora already comes with PipeWire support enabled and you don’t need to do anything special. You can test it for example with this testing page. The PipeWire support in Firefox unfortunately needs to be enabled during the build with manual changes, which is most likely not happening in other distributions, but from what I know there is an ongoing effort to make this configurable with a build option.

Screen sharing in Chromium/Chrome

The situation with Chromium is similar: PipeWire also needs to be enabled during the build, but it’s already configurable via a build option. In Fedora we have this enabled by default. Official Chrome builds are built with PipeWire support enabled as well. I should maybe mention that PipeWire support is in Chromium starting with version 74.

Unlike with Firefox, this support also needs to be enabled at runtime. You can do that with the following Chrome flag:

chrome://flags/#enable-webrtc-pipewire-capturer

Then you should be all set to share a screen or a window from your Chromium or Chrome.

Issues

Don’t get scared by the higher number of dialogs for screen/window selection you will get when sharing a screen in your browser. We are aware of this annoying usability issue and hopefully we will manage to solve it one day. The reason why this happens is that every browser provides its own dialog for screen/window selection, and in both browsers these dialogs show previews of your selection. You need to select the screen/window first in the portal dialog for the preview dialog, and once you accept the preview dialog in your browser, you again need to select the screen/window in the portal dialog to get the content into the web page itself.

Support in other applications

There is a KDE application called Krfb, which in the next release (19.08) will have similar support for remote desktop on Wayland as you have in Gnome. Otherwise there are probably no other applications which would allow you to share a screen or control your desktop remotely in a Gnome Wayland session. You will not be able to use TeamViewer, Tigervnc or any other application you are used to using. If you want to use these applications, you will have to switch to an X session for now.

How to enable and use screen sharing on Wayland

Two days ago I wrote about our work on screen sharing in web browsers. While a lot of work has been done recently in this area, it’s still not in a state where everything would just work out of the box. There are a few steps necessary to make this work for you and here is a brief summary of what you need. This is not a distro-specific how-to, but given that I use Fedora 28 and I know that everything you need is there, you will most likely need to figure out the differences for your distribution or build things yourself.

PipeWire

PipeWire is the core technology used behind all of this. In Fedora you just need to install it; it’s available for Fedora 27 and newer. Once PipeWire is installed, you can start it using the “pipewire” command. If you want to see what’s going on, you can use “PIPEWIRE_DEBUG=4 pipewire” to start PipeWire with debug information. For Fedora 29, there is a feature planned which should make PipeWire start automatically.

Xdg-desktop-portal and xdg-desktop-portal-[kde,gtk]

We use xdg-desktop-portal (+ a backend implementation) for communication between the app requesting to share a screen and the desktop (Plasma or Gnome). You need xdg-desktop-portal, which is the middle man between the app and the backend implementation, compiled with the screencast portal. This portal is built automatically when PipeWire is present during the build. In Fedora you should already be covered when you install it. For the backend implementation, if you are using Plasma, you need xdg-desktop-portal-kde from Plasma 5.13.x, again compiled with the screencast portal, which is built when PipeWire is present. For Fedora 28+, you can use this COPR repository and you are ready to go. I highly recommend using Plasma 5.13.2, where I have some minor fixes, and if you have a chance, try to compile the upcoming 5.13.3 version from git (Plasma/5.13 branch), as I rewrote how we connect to PipeWire. Previously our portal implementation worked only when PipeWire was started first; now it shouldn’t matter. If you use Gnome, you can just install xdg-desktop-portal-gtk from the Fedora repository or build it yourself. You again need to build the screencast portal.

Enabling screen sharing in your desktop

Both Plasma and Gnome need some adjustments to enable screen sharing, as in both cases it’s an experimental feature. For Gnome you can follow this guide; just enable the screen-cast feature using gsettings. For Plasma, you need to get KWin from Plasma 5.13.x, which is available for Fedora in the COPR repository mentioned above. Then you need to set and export the KWIN_REMOTE=1 env variable before KWin starts. There is also one more thing needed for Gnome at this moment: you need to backport this patch to Mutter, otherwise it won’t be able to match the PipeWire stream configuration with an app using a different framerate, e.g. when using Firefox.

Edit: It seems that exporting KWIN_REMOTE=1 is not necessary; it probably was only during the time when this feature was not merged yet. Now it should work without it. You still need KWin from Plasma 5.13.

Start with screen sharing

Now you should be all set and ready to share a screen in a Gnome/Plasma Wayland session. You can now try Firefox for Fedora 28 or Rawhide from this COPR repository. For Firefox there is a WebRTC test page, where you can test this screen sharing functionality. Another option is to use my test application for Flatpak portals, or the gnome-remote-desktop app.

Edit: I didn’t realize that not everyone knows about xdg-desktop-portal or PipeWire; below are some links where you can get an idea what this is all about. I should also mention that while xdg-desktop-portal is primarily designed for Flatpak, its usage has expanded over time, as it perfectly makes sense to use it for e.g. Wayland: just like in a sandbox, where apps don’t have access to your system, apps on Wayland don’t know about other apps or windows, and communication can be done only through the compositor.

 
