Keith Packard <[email protected]> writes: > Eric Anholt <[email protected]> writes: > >> Keith Packard <[email protected]> writes: >> >>> glamor_transfer had some hard-coded depth to format/type conversions >>> which may not match the format used with render. This switches to >>> using the same formats as the render portion of glamor. >> >> glamor_get_tex_format_type_from_pixmap() looks hopelessly broken. It's >> trying to choose the texture's actual channel layout using the >> format/type arguments to TexImage. But GL doesn't say anything about >> format/type implying anything about the resulting texture channel sizes >> or layout -- until ARB_texture_view, it just strongly implies that >> format/type doesn't matter, and with ARB_texture_view and >> ARB_shader_image_load_store it's guaranteed that it doesn't matter. > > If you ignore how the data are actually stored and just say that glamor > should be using a consistent format for pixels going in and coming out, > then it makes sense to track what the last format we used was. If the > driver can support 2 10 10 10, then presumably you're far more likely to > get 2 10 10 10 data if you ask for it. > >> So what you end up doing here is saying "oh, somebody's attached a >> 2101010 picture to my pixmap! Now in PutImage I'll upload my 32 bits of >> data as 2 bits expanded into the 8-bit alpha and 10 bits truncated into >> the 8 bits or r, g, and b." > > Yeah, if it isn't actually attempting to fix the format of the pixmap by > downloading in 8 8 8 8 format and re-uploading in 2 10 10 10 format, > then it's hopelessly broken. However, if we create a pixmap, create a > picture and then upload data, things might work? > >> For bonus brokenness, delete the picture (unsetting >> pixmap_priv->base.picture), and do a GetImage (now using >> format_for_depth, 8888 like before), and your data is trashed because >> you read your actually-8888 data as 8888 without doing any bitshifting >> to partially recover from the 2101010->8888 screwup. > > Right, clearly the pixmap/fbo needs to track the format rather than > using the attached picture for this data. > >> I think instead we want to just rip out the wacky-format support from >> glamor_render.c and the pixmap_priv->picture field, and fall back until >> we get texture_view code. > > We have 2 10 10 10 scanout devices, and I don't really want to just not > support those at all for now; is there anything we can do?
Because our gbm BOs (the only thing we'll use for scanout) are always
allocated as 8888, and that allocation dictates the texture's actual
format (unlike the format/type arguments to TexImage), we'd better be
telling GL that the 32-bit pixel data we want to upload is 8888.
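To make the point concrete, here is a minimal sketch (not glamor's actual
code; the texture allocation, helper name, and the GL_BGRA/GL_UNSIGNED_BYTE
pair are illustrative assumptions) of why the format/type pair has to
describe the client-side 8888 bytes rather than the 2:10:10:10 layout we
might wish the texture had:

static void
upload_8888_pixels(GLuint tex, int width, int height, const void *pixels)
{
    glBindTexture(GL_TEXTURE_2D, tex);

    /* Storage is allocated as 8888; the internalformat fixes the channel
     * layout no matter what format/type gets passed on later uploads. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_BGRA, GL_UNSIGNED_BYTE, NULL);

    /* Consistent: tell GL the client data really is 8-bit-per-channel. */
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                    GL_BGRA, GL_UNSIGNED_BYTE, pixels);

    /* The broken case discussed above: describing the same bytes as
     * 2:10:10:10 makes GL repack them into the 8888 store, truncating
     * the 10-bit channels and expanding the 2-bit alpha, so a later
     * 8888 GetImage hands back mangled data.
     *
     * glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
     *                 GL_BGRA, GL_UNSIGNED_INT_2_10_10_10_REV, pixels);
     */
}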
