ROG/QAM as L5/R5 or M1/M2

Hi, I just installed Bazzite as a dual boot on my ASUS ROG Ally X. I play a lot of KBM games like Factorio, Dyson Sphere Program, and Kerbal Space Program, so maximizing the available buttons is important to me. Since the features of the ROG and QAM buttons are also available as side-swipe gestures, I'd like these buttons to show up in Steam Input as the extra paddles on a Steam Deck or the M1/M2 back buttons on a DualSense Edge, so I can map them to other keyboard combos. Is this possible?
antheas
antheas•4mo ago
Should be like that by default when you use dualsense
Progor
ProgorOP•4mo ago
When I use DualSense and then go to test device inputs in Steam, ROG activates the PlayStation button in the middle, and QAM activates PS + X.
antheas
antheas•4mo ago
Oh, you want to use those too... I will think about that; nobody has asked about it before.
Progor
ProgorOP•2mo ago
I understand those 2 buttons are weird, with firmware triggering different functions if you hold them. I'm just looking to map short presses to single keys, and I understand using them in a chord with other buttons will never work. Switching to the beta channel on Steam Input lets me remap the QAM button. Now it's just the guide button that I can't remap. @antheas I submitted a PR for this request. I'm also planning on mapping a portion of the touchscreen to touchpads. This will make my ROG Ally fully compatible with Steam Deck inputs (minus hold on L5/R5). I'm also considering making the multi-tap a setting instead of an environment variable, so it can be set in per-game profiles.
antheas
antheas•2mo ago
Let's keep this PR open for now. I really don't want to touch that code that much; my linter broke because it's so complex.
Progor
ProgorOP•2mo ago
Yeah, took me a couple of days to grok it and I've been doing this a long time. Would you be up for me taking a stab at simplifying it, even if it turns into a pretty significant rewrite?
antheas
antheas•2mo ago
Let's not touch that code. If you want this to merge, don't dual-function the buttons; make them just normal extra_l3 and r3, because you will break other devices. Then hide them under an advanced setting in the DualSense controller. Also, that setting can break those buttons, so it's dangerous; thinking about it, it probably needs to be an env var.
Progor
ProgorOP•2mo ago
OK, I saw that nothing was mapped to extra_l2 and r2 for the ROG Ally, and it just shows up in Steam Input as L5 and R5 but is not accessible. The only way to get into this state is with the guide_to_paddles setting, which is only available on the ROG Ally, so I'm not sure how it could conflict with other devices. If other devices have their own way to access L5 and R5 already, they would have no need to enable guide_to_paddles. How would extra_l3 and r3 get mapped into Steam Input? Would I end up with an L6 and R6 and still have an inaccessible L5 and R5?
antheas
antheas•2mo ago
I forgot the Ally has two back buttons. Both the DualSense Edge and the Steam Controller have 4 back buttons, so both work. You commented out a piece of code.
Progor
ProgorOP•2mo ago
That must be the Ally X. I have the OG.
antheas
antheas•2mo ago
Try to make it work with all devices, with ROG swap, and with holding. The OG also has 2 back buttons.
Progor
ProgorOP•2mo ago
Just one here...
[screenshot attached]
antheas
antheas•2mo ago
two as in one left and one right
Progor
ProgorOP•2mo ago
(Per side, that is)
Progor
ProgorOP•2mo ago
The purpose is to get L5 and R5 down here working.
[screenshot attached]
antheas
antheas•2mo ago
In any case: env var + you don't touch Ally code + no triple and double taps, and it works with swap.
Progor
ProgorOP•2mo ago
Fully tested with swap on this device anyway. As for the code I commented out: I searched everywhere for anything that responds to the special guide event and couldn't find anything that used it. Maybe there are some external plug-ins or something?
antheas
antheas•2mo ago
That used to close the overlay; maybe not anymore.
Progor
ProgorOP•2mo ago
I'll be happy to put that back in just in case, and leave guide without the double-tap option. Can I assume the QAM can stay as it is, since it already had triple-tap?
antheas
antheas•2mo ago
No: force remap them to extra_l2 and extra_r2 before any processing, and that's it.
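The forced remap antheas describes could be sketched roughly like this. This is a minimal illustration, not hhd's actual API: the env var name, the dict-style event shape, and the assumption that the ROG button arrives as "mode" and QAM as "share" are all guesses for the example.

```python
import os

# Assumed button codes: ROG (guide) -> "mode", QAM -> "share"
REMAP = {"mode": "extra_l2", "share": "extra_r2"}

def remap_guide_buttons(events):
    """Rewrite guide/QAM button codes to paddle codes before any other
    processing, gated behind a (hypothetical) environment variable."""
    if os.environ.get("HHD_ALLY_GUIDE_TO_PADDLES") != "1":
        return events
    return [
        {**ev, "code": REMAP[ev["code"]]}
        if ev.get("type") == "button" and ev.get("code") in REMAP
        else ev
        for ev in events
    ]
```

Because the rewrite happens before the multiplexer sees the events, no downstream dual-function or double-tap logic ever observes the original codes, which is what keeps the change isolated from other devices.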
Progor
ProgorOP•2mo ago
Got it, so no visible preferences. This is an advanced feature for power users like me 🙂 Same for creating virtual touchpads on the touchscreen, I presume?
antheas
antheas•2mo ago
There is a touchpad handler, but it's not at all ready for that. What you want to do is not going to be easy, because you'll probably need to handle multiple fingers.
Progor
ProgorOP•2mo ago
Yep. The plan was to track the first finger in each of the 2 areas I section out for trackpad virtualization, and ignore any others. But I've seen touchscreens in the past that randomly assigned the same finger to different ids each tick, so I realize I will probably need to track finger position myself and not rely on the driver to consistently give me the same real finger at the same id. If it does, that just makes it easier.
antheas
antheas•2mo ago
the first finger to touch the screen is assigned the first id
Progor
ProgorOP•2mo ago
So I only have 6 cases: new finger starting in the left area, finger entering the left area from outside, finger exiting the left area, and the same 3 for the right. I'll have to play with it to see what feels best for entering/exiting... probably projecting to the nearest position inside when you exit, and ignoring fingers that start outside (to avoid confusing a top swipe that veered down into the virtual trackpad area with a trackpad gesture). Either way, I'll keep the changes as minimal as possible and behind an environment variable, and if it turns out to work well and be useful, we can talk about adding settings and making it more accessible.
Right now, the logical place to start seems to be in plugins/overlay/controllers.py as an extension of the gesture code to capture the input, then process it in the multiplexer, plus some changes in dualsense to add support for tracking the second finger. It looks like there are constants for left_touchpad_x/_y and right_touchpad_x/_y, but they aren't implemented yet. The second offset for the second finger axis is defined on touchpad_touch2, though, so I know how to send the coordinates. I'm guessing the whole trackpad covers the [0, 1] space and I'll just send x = [0, 0.5] on touchpad_touch for the left trackpad, and x = [0.5, 1] on touchpad_touch2 for the right trackpad. If that doesn't work I'll experiment until it does. 🙂
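The coordinate scheme described here (left virtual pad on x in [0, 0.5] of touchpad_touch, right pad on touchpad_touch2 in [0.5, 1]) could be prototyped as below. The region bounds, event field names, and clamping behavior are made-up illustrations of the idea, not hhd's confirmed conventions.

```python
# Virtual trackpad regions in normalized screen coordinates (x0, y0, x1, y1).
# Bounds are arbitrary examples; a real version would make these configurable.
LEFT_PAD = (0.00, 0.55, 0.30, 0.95)
RIGHT_PAD = (0.70, 0.55, 1.00, 0.95)

def to_pad_coords(x, y, region):
    """Map a normalized screen point into a region, clamped to its edges
    (fingers that exit the region project to the nearest inside position)."""
    x0, y0, x1, y1 = region
    px = min(max((x - x0) / (x1 - x0), 0.0), 1.0)
    py = min(max((y - y0) / (y1 - y0), 0.0), 1.0)
    return px, py

def emit_touch(x, y):
    """Return the touch event for a finger, splitting the DualSense pad in
    half: left virtual pad reports x in [0, 0.5], right pad in [0.5, 1]."""
    x0, y0, x1, y1 = LEFT_PAD
    if x0 <= x <= x1 and y0 <= y <= y1:
        px, py = to_pad_coords(x, y, LEFT_PAD)
        return {"finger": "touchpad_touch", "x": px * 0.5, "y": py}
    x0, y0, x1, y1 = RIGHT_PAD
    if x0 <= x <= x1 and y0 <= y <= y1:
        px, py = to_pad_coords(x, y, RIGHT_PAD)
        return {"finger": "touchpad_touch2", "x": 0.5 + px * 0.5, "y": py}
    return None  # finger started outside both regions; ignore it
```

The clamp in to_pad_coords is the "project to the nearest position inside when you exit" behavior; returning None for out-of-region touches covers the "ignore fingers that start outside" case.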
antheas
antheas•2mo ago
Gesture code? No; probably you'll need to read the touchscreen from the device and grab it. You don't have the bandwidth to pass that many events between those.
Progor
ProgorOP•2mo ago
Ah, good point. I see the overlay/gesture code internally processes the touchscreen events and only emits an event once a gesture is recognized. Flooding the event queue with multiple events for every touchscreen x/y/down/up/click/right_click update would be significant overhead. Since every handheld has a touchscreen (I think?), I wonder if it makes sense to create a universal touchscreen in controllers/physical/touchscreen.py and have each device import it, providing the VID/PID to the constructor. That way, logic like tap-to-click can be implemented centrally. Either way, I'll start with a specific solution in rog_ally, keeping in mind it might be worth extracting to a more generic location later. Also, I've updated the PR with the straightforward environment variable and direct code replacement. The code for mode/share is replaced after the swap_guide block, so it should meet all your requirements for merging. Let me know if I missed something.
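Reading the touchscreen directly from its /dev/input node, as antheas suggests, might look like this minimal sketch. It assumes the 64-bit Linux input_event struct layout; finding the node by VID/PID and the surrounding event loop are left out, and the grab helper is only illustrative.

```python
import struct

# Linux input event codes relevant to a multitouch screen
EV_ABS = 0x03
ABS_MT_SLOT = 0x2F
ABS_MT_POSITION_X = 0x35
ABS_MT_POSITION_Y = 0x36
ABS_MT_TRACKING_ID = 0x39

# struct input_event { struct timeval time; __u16 type; __u16 code; __s32 value; }
# on 64-bit Linux: two longs, two unsigned shorts, one signed int (24 bytes)
EVENT_FMT = "llHHi"
EVENT_SIZE = struct.calcsize(EVENT_FMT)

def decode_events(buf):
    """Yield (type, code, value) tuples from a raw read() of the device node."""
    for off in range(0, len(buf) - EVENT_SIZE + 1, EVENT_SIZE):
        _sec, _usec, etype, code, value = struct.unpack_from(EVENT_FMT, buf, off)
        yield etype, code, value

def grab(fd):
    """Take exclusive ownership of the device so touches stop reaching the
    desktop (this is why configured shortcuts would need to be disabled)."""
    import fcntl
    EVIOCGRAB = 0x40044590  # _IOW('E', 0x90, int)
    fcntl.ioctl(fd, EVIOCGRAB, 1)
```

Decoding raw structs in the device reader keeps all the per-sample traffic inside one thread, so only the finished touchpad coordinates ever cross into the rest of the daemon.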
Progor
ProgorOP•2mo ago
@antheas I rebased the guide-to-paddles PR on your latest commits if you're still interested in merging it in. I've also got a first draft of touchscreen trackpads working here: https://github.com/dpwhittaker/hhd/tree/touchscreen-trackpads. What I see right off the bat is that to make this generally useful, it's going to need some configuration: the placement and size of the onscreen touchpads will change from person to person depending on hand size and grip style. Since I'm grabbing the touchscreen, hhd goes into a crash loop if any shortcuts are configured. So I'm planning to set the areas above and below the onscreen trackpads as virtual buttons that bring up the Steam, QAM, and hhd overlays and the keyboard that the current swipe gestures support, but if this were to be released, it would need to disable shortcuts on the overlay. I guess this functionality could be skipped with an environment variable, and this feature could read the shortcuts configuration to decide which feature to put on each virtual button.
Progor
ProgorOP•2mo ago
So I guess these are the options:
1: If you think this is a feature worth including as a first-class feature of HHD, then refactoring touchscreen shortcuts out of the overlay plugin and into the core event loop makes sense. My Touchscreen class would move out of hhd.device.rog_ally and into hhd.controller. We could bring over the touchscreen section of overlay.find_devices instead of hard-coding the VID/PID. Each device would need to specify its screen resolution instead of hard-coding the screen-to-touchpad transformation. And there should probably be some additional configuration to specify the position and size of the touchpads.
2: If you think this is a power-user feature hidden behind environment variables, then the additional config would live in a dedicated file in .config/hhd/touchscreen.yml, or perhaps in the usual state.yml, but would only be editable by hhdctl and not show up in the UI. Touchscreen shortcut code in the overlay plugin would just be completely disabled by the environment variable. Each device would need to implement its own independent solution if people with other devices wanted the functionality.
I guess there's a third option: do #1, but behind an environment variable. Once a "critical mass" of devices has been implemented and tested, remove the environment variable and the code that would then become dead. I'm not sure how to "feature-flag" the configs, so they would probably need to be modified with hhdctl until the feature is ready, then added to the .ymls.
I looked at options for building this as a standalone app outside of HHD, but I need the touchscreen inputs to show up as inputs on the virtual DualSense controller to activate the Steam Input functionality I'm looking for, so it has to be at least partially integrated into HHD. HHD plugins use the low-bandwidth emit functionality as you pointed out, so that's not an option either.
So we either need to agree on an approach you are comfortable with including in the product, or I'm stuck keeping a fork up to date with your changes.
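The "feature-flag the configs" idea from the third option could be sketched like this: settings live in the normal config state but are only honored while an environment variable is set, so nothing changes for other devices until the flag is removed. The env var name, key names, and defaults are all invented for the example.

```python
import os

# Illustrative defaults for the trackpad regions (x0, y0, x1, y1),
# overridable from the (assumed) "touchscreen" section of the config state
DEFAULTS = {
    "left_pad": [0.00, 0.55, 0.30, 0.95],
    "right_pad": [0.70, 0.55, 1.00, 0.95],
}

def load_trackpad_conf(state):
    """Return trackpad settings merged over defaults, or None when the
    feature flag is off (the feature then behaves as if it doesn't exist)."""
    if os.environ.get("HHD_TOUCHSCREEN_TRACKPADS") != "1":
        return None
    conf = dict(DEFAULTS)
    conf.update(state.get("touchscreen", {}))
    return conf
```

Until the flag becomes permanent, the "touchscreen" section would be written with hhdctl rather than exposed in the UI, matching the hidden-config idea above.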
