Windows scripts executed with correct X/Y but on the wrong display

Hi,

We are using the Appium .NET client (dotnet-client) to test our Windows desktop application.
We are currently migrating from version 4.* to 5.1, and switching to the Appium server instead of talking to WAD directly.
I'm grateful that a bug I reported earlier was fixed quickly, but now I'm running into another problem.
Previously we used the Actions API to simulate right-clicks and double-clicks, for example:

        var action = new Actions(driver);
        action.MoveToElement(element);
        action.DoubleClick();
        action.Perform();

But since these no longer seem to be supported, we switched to appium-windows-driver extension commands like the following:

        driver.ExecuteScript("windows: click", new Dictionary<string, object>
        {
            { "elementId", element.Id },
            { "times", 2 }, // number of clicks; 2 = double-click
        });

With two monitors, the application under test might launch on the right monitor.
Setting the window position to (0, 0) places the application at (0, 0) on the right monitor, the same behaviour as before.
However, when the script above is executed, the mouse is moved to the element's X/Y coordinates on the left monitor instead of the monitor where the application is running; this was not the case before.

Is this a known issue (and if so, is it a bug)? What would be the best way to fix it?
Thanks in advance.

Coordinates for these actions are calculated according to the rules described in MOUSEINPUT (winuser.h) - Win32 apps | Microsoft Learn.
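To illustrate what those rules mean for a multi-monitor setup, here is a minimal sketch (not the driver's actual code; the `toAbsolute` helper and the screen layout are made up) of how SendInput absolute coordinates are normalized to the 0..65535 range. With MOUSEEVENTF_VIRTUALDESK set, the reference area is the full virtual desktop; without it, Windows maps the same values onto the primary monitor only:

```javascript
// Illustrative only: normalization of absolute mouse coordinates per the
// MOUSEINPUT docs. Absolute dx/dy are fractions of the reference area,
// scaled to a 16-bit range.
const ABS_RANGE = 65536;

function toAbsolute(x, y, screen) {
  // screen: { left, top, width, height } of the reference area.
  // With MOUSEEVENTF_VIRTUALDESK this must be the whole virtual desktop;
  // without the flag, the OS interprets dx/dy against the primary monitor.
  return {
    dx: Math.round(((x - screen.left) * ABS_RANGE) / screen.width),
    dy: Math.round(((y - screen.top) * ABS_RANGE) / screen.height),
  };
}

// Example: two 1920x1080 monitors side by side, primary on the left.
const virtualDesk = { left: 0, top: 0, width: 3840, height: 1080 };

// A point on the right monitor:
const abs = toAbsolute(2880, 540, virtualDesk);
console.log(abs); // { dx: 49152, dy: 32768 }

// If those same dx/dy values were interpreted against the primary monitor
// alone (i.e. the flag is set but the coordinates were computed relative to
// one monitor, or vice versa), the pointer would land at
// 49152 / 65536 * 1920 = 1440 — on the left monitor.
```

This mismatch — coordinates computed against one reference area but interpreted against another — would produce exactly the symptom described above: the X/Y offsets look correct, but the click lands on the wrong display.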

appium-windows-driver transforms gesture API calls into the following input structures: appium-windows-driver/lib/commands/winapi/user32.js at ff86af0dfb3d151caf7dc76de71238c54327e264 · appium/appium-windows-driver · GitHub

After removing the MOUSEEVENTF_VIRTUALDESK flag in the toMouseMoveInput function, the problem disappears and the correct monitor is used. However, I don't know the code and the underlying input events well enough to propose this as a fix without side effects.