On Mon, Aug 29, 2022 at 07:56:40AM +0100, Hashim Mahmoud wrote:
> This iMac's screen works fine, but always turns off when the radeon GPU
> driver is loaded on any UNIX (I tried Linux Mint and NomadBSD as well).
> I had to install OpenBSD from another machine: I was installing to an
> SD card, as I didn't have any (big enough) spare SSDs or USB drives,
> and writing to the SD card from the installer on the same iMac gives me
> permission errors.
>
> I had already installed the radeon firmware by the time I put the SD
> card into the iMac. The GPU seems to work fine and I can log in and run
> commands blindly, but the display is off... until I had the bright idea
> to connect a display over DisplayPort at boot time, via a VGA to mini
> DisplayPort adapter. The internal eDP display then works, but the tty
> framebuffer takes the resolution of the external display (1440x900),
> similar to what happens when a low-resolution laptop is connected to a
> higher-resolution display and both displays are on. The external
> display receives no output, though Xorg seems to acknowledge its
> existence and renders as if it exists: the xenodm login window is not
> on the main display, and the mouse disappears off the left edge as if
> it went to another display. I can log in blind and run `xrandr --output
> DisplayPort-0 --off`, after which everything works perfectly fine and
> GPU accelerated, as the output of `glxinfo -B` indicates, and glxgears
> runs smoothly.
>
> However, this only works if the DisplayPort is connected at boot time:
>
> - If it is disconnected after the radeon driver is loaded, but before X
>   has started, the display goes black again. I can plug it back in and
>   the display comes back (still at 1440x900), and I get this in blue:
>   `[drm] *ERROR* chosen encoder in use 0`. No idea what that means.
> - Once X is started, however, I can just disconnect the DisplayPort
>   and life is good :) ... except I get the following in xconsole:
>   ```
>   [drm] *ERROR* displayport link status failed
>   [drm] *ERROR* clock recovery failed
>   [drm] *ERROR* displayport link status failed
>   [drm] *ERROR* clock recovery failed
>   ```
>   It doesn't seem to affect me in any way, though.
>
> - I also get this error message when I switch from my X session to the
>   tty: `[drm] *ERROR* rv770_restrict_performance_levels_before_switch
>   failed`. It also seems to have no effect.
>
> - Note that these errors only appear in xconsole and not on the actual
>   console in blue... weird.
>
> Is there some sort of software remedy for this? I doubt it (unless
> someone can tell me otherwise), so I was thinking of getting something
> like a Raspberry Pi Zero or an Arduino and connecting it to the
> DisplayPort, programming it to act as some sort of dummy DisplayPort
> sink... I'm not sure at all how to go about doing this, as I'm a very
> amateur programmer, but I know that it *should* be theoretically
> possible; any pointers would be very appreciated.
Does this help? RV730 has DCE3.2

Index: sys/dev/pci/drm/radeon/atombios_encoders.c
===================================================================
RCS file: /cvs/src/sys/dev/pci/drm/radeon/atombios_encoders.c,v
retrieving revision 1.17
diff -u -p -U6 -r1.17 atombios_encoders.c
--- sys/dev/pci/drm/radeon/atombios_encoders.c	24 Feb 2022 12:49:47 -0000	1.17
+++ sys/dev/pci/drm/radeon/atombios_encoders.c	29 Aug 2022 07:12:12 -0000
@@ -2189,13 +2189,14 @@ int radeon_atom_pick_dig_encoder(struct 
 	/*
 	 * On DCE32 any encoder can drive any block so usually just use crtc id,
 	 * but Apple thinks different at least on iMac10,1, so there use linkb,
 	 * otherwise the internal eDP panel will stay dark.
 	 */
 	if (ASIC_IS_DCE32(rdev)) {
-		if (dmi_match(DMI_PRODUCT_NAME, "iMac10,1"))
+		if (dmi_match(DMI_PRODUCT_NAME, "iMac10,1") ||
+		    dmi_match(DMI_PRODUCT_NAME, "iMac11,2"))
			enc_idx = (dig->linkb) ? 1 : 0;
		else
			enc_idx = radeon_crtc->crtc_id;
		goto assigned;
	}