Re: CDF meeting @FOSDEM report

2013-02-06 Thread Alex Deucher
On Wed, Feb 6, 2013 at 6:11 AM, Tomi Valkeinen tomi.valkei...@ti.com wrote:
 Hi,

 On 2013-02-06 00:27, Laurent Pinchart wrote:
 Hello,

 We've hosted a CDF meeting at the FOSDEM on Sunday morning. Here's a summary
 of the discussions.

 Thanks for the summary. I've been on a longish leave, and just got back,
 so I haven't read the recent CDF discussions on lists yet. I thought
 I'd start by replying to this summary first =).

 0. Abbreviations
 

 DBI - Display Bus Interface, a parallel video control and data bus that
 transmits data using parallel data, read/write, chip select and address
 signals, similarly to 8051-style microcontroller parallel busses. This is a
 mixed video control and data bus.

 DPI - Display Pixel Interface, a parallel video data bus that transmits data
 using parallel data, h/v sync and clock signals. This is a video data bus
 only.

 DSI - Display Serial Interface, a serial video control and data bus that
 transmits data using one or more differential serial lines. This is a mixed
 video control and data bus.

 In case you'll re-use these abbrevs in later posts, I think it would be
 good to mention that DPI is a one-way bus, whereas DBI and DSI are
 two-way (perhaps that's implicit with control bus, though).

 1. Goals
 

 The meeting started with a brief discussion about the CDF goals.

 Tomi Valkeinen and Tomasz Figa have sent RFC patches to show their views of
 what CDF could/should be. Many others have provided very valuable feedback.
 Given the early development stage, propositions were sometimes contradictory
 and focused on different areas of interest. We have thus started the meeting
 with a discussion about what CDF should try to achieve, and what it
 shouldn't.

 CDF has two main purposes. The original goal was to support display panels in
 a platform- and subsystem-independent way. While mostly useful for embedded
 systems, the emergence of platforms such as Intel Medfield and ARM-based PCs
 that blend the embedded and PC worlds makes panel support useful for the PC
 world as well.

 The second purpose is to provide a cross-subsystem interface to support video
 encoders. The idea originally came from a generalisation of the original RFC
 that supported panels only. While encoder support is considered a lower
 priority than display panel support by developers focused on display
 controller drivers (Intel, Renesas, ST Ericsson, TI), companies that produce
 video encoders (Analog Devices, and likely others) don't share that point of
 view and would like to provide a single encoder driver that can be used in
 both KMS and V4L2 drivers.

 What is an encoder? Something that takes a video signal in, and lets the
 CPU store the received data to memory? Isn't that a decoder?

 Or do you mean something that takes a video signal in, and outputs a
 video signal in another format? (transcoder?)

In KMS parlance, we have two objects: a crtc and an encoder.  A crtc
reads data from memory and produces a data stream with display timing.
The encoder then takes that data stream and timing from the crtc and
converts it to some sort of physical signal (LVDS, TMDS, DP, etc.).  It's
not always a perfect match to the hardware.  For example, a lot of GPUs
have a DVO encoder which feeds a secondary encoder like an sil164
DVO-to-TMDS encoder.
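
To put that in purely illustrative code: none of these struct names
exist in DRM, they're made up to sketch the crtc -> encoder ->
secondary encoder relationship described above.

/*
 * Illustrative only: a toy model of the KMS pipeline described above,
 * not the real drm_crtc/drm_encoder API.
 */
#include <stdio.h>

/* A crtc scans out memory and produces a timed pixel stream. */
struct model_crtc {
	const char *name;
	int hdisplay, vdisplay;
};

/* An encoder converts that stream to a physical signal (LVDS, TMDS, DP...). */
struct model_encoder {
	const char *name;
	const char *signal;		/* what it outputs */
	struct model_encoder *next;	/* optional secondary encoder */
};

int main(void)
{
	struct model_crtc crtc = { "crtc0", 1920, 1080 };
	/* GPU-internal DVO encoder feeding an external sil164-style chip. */
	struct model_encoder tmds = { "sil164", "TMDS", NULL };
	struct model_encoder dvo = { "dvo0", "DVO", &tmds };
	struct model_encoder *e;

	printf("%s (%dx%d)", crtc.name, crtc.hdisplay, crtc.vdisplay);
	for (e = &dvo; e; e = e->next)
		printf(" -> %s [%s]", e->name, e->signal);
	printf("\n");
	return 0;
}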

Alex


Re: CDF meeting @FOSDEM report

2013-02-06 Thread Tomi Valkeinen
On 2013-02-06 16:44, Alex Deucher wrote:
 On Wed, Feb 6, 2013 at 6:11 AM, Tomi Valkeinen tomi.valkei...@ti.com wrote:

 What is an encoder? Something that takes a video signal in, and lets the
 CPU store the received data to memory? Isn't that a decoder?

 Or do you mean something that takes a video signal in, and outputs a
 video signal in another format? (transcoder?)
 
 In KMS parlance, we have two objects: a crtc and an encoder.  A crtc
 reads data from memory and produces a data stream with display timing.
 The encoder then takes that data stream and timing from the crtc and
 converts it to some sort of physical signal (LVDS, TMDS, DP, etc.).  It's

Isn't the video stream between the CRTC and the encoder just as
physical? It just happens to be inside the GPU.

This is the case for OMAP, at least, where DISPC could be considered
the CRTC, and DSI/HDMI/etc. could be considered the encoder. The stream
between DISPC and DSI/HDMI is a plain parallel RGB signal. The video
stream could just as well be outside the OMAP.

 not always a perfect match to the hardware.  For example, a lot of GPUs
 have a DVO encoder which feeds a secondary encoder like an sil164
 DVO-to-TMDS encoder.

Right. I think mapping the DRM entities to CDF ones is one of the bigger
question marks we have with CDF. While I'm no expert on DRM, I think we
have the following options:

1. Force DRM's model to CDF, meaning one encoder.

2. Extend DRM to support multiple encoders in a chain.

3. Support multiple encoders in a chain in CDF, but somehow map them to
a single encoder on the DRM side.

I really dislike the first option, as it would severely limit where CDF
can be used, or would force you to write some kind of combined drivers,
so that you can have one encoder driver running multiple encoder devices.
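
To make option 3 a bit more concrete, here's a rough sketch of wrapping
a chain of CDF-style entities behind a single encoder-level object on
the DRM side. All names are made up; this is just an illustration of
the idea, not a proposed API.

/*
 * Illustration of option 3: a chain of CDF-style entities hidden
 * behind one encoder-level object on the DRM side.  All names are
 * hypothetical.
 */
#include <stdio.h>

struct cdf_entity {
	const char *name;			/* e.g. "dsi", "dsi-to-lvds bridge" */
	int (*enable)(struct cdf_entity *e);
	struct cdf_entity *next;		/* next entity towards the panel */
};

/* What the DRM driver would expose as its single "encoder". */
struct drm_side_encoder {
	struct cdf_entity *chain;		/* head of the CDF chain */
};

/* Enabling the DRM-side encoder walks the whole chain, source to sink. */
static int drm_side_encoder_enable(struct drm_side_encoder *enc)
{
	struct cdf_entity *e;
	int ret;

	for (e = enc->chain; e; e = e->next) {
		ret = e->enable(e);
		if (ret)
			return ret;
	}
	return 0;
}

static int dummy_enable(struct cdf_entity *e)
{
	printf("enable %s\n", e->name);
	return 0;
}

int main(void)
{
	struct cdf_entity panel  = { "panel", dummy_enable, NULL };
	struct cdf_entity bridge = { "dsi-to-lvds bridge", dummy_enable, &panel };
	struct cdf_entity dsi    = { "dsi", dummy_enable, &bridge };
	struct drm_side_encoder enc = { &dsi };

	return drm_side_encoder_enable(&enc);
}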

 Tomi






Re: [Linaro-mm-sig] CDF meeting @FOSDEM report

2013-02-06 Thread Daniel Vetter
On Wed, Feb 6, 2013 at 4:00 PM, Tomi Valkeinen tomi.valkei...@ti.com wrote:
 not always a perfect match to the hardware.  For example, a lot of GPUs
 have a DVO encoder which feeds a secondary encoder like an sil164
 DVO-to-TMDS encoder.

 Right. I think mapping the DRM entities to CDF ones is one of the bigger
 question marks we have with CDF. While I'm no expert on DRM, I think we
 have the following options:

 1. Force DRM's model to CDF, meaning one encoder.

 2. Extend DRM to support multiple encoders in a chain.

 3. Support multiple encoders in a chain in CDF, but somehow map them to
 a single encoder on the DRM side.

4. Ignore drm kms encoders.

They are only exposed to userspace as a means to discover very simple
constraints, e.g. one encoder connected to two outputs means you can
only use one of the outputs at a time. They are completely irrelevant
for the actual modeset interface exposed to drivers, so you could
create a fake kms encoder for each connector you expose through kms.
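
As a toy illustration of that userspace-visible role (the structures
here are invented, not the real KMS API): two connectors wired to the
same encoder simply tell userspace that they can't be driven at the
same time.

/*
 * Toy model of the constraint encoders convey to userspace:
 * connectors sharing an encoder are mutually exclusive.
 */
#include <stdio.h>

struct kms_encoder { int id; };

struct kms_connector {
	const char *name;
	struct kms_encoder *enc;	/* the single encoder it can use */
};

/* Userspace-style check: two connectors can only be lit together if
 * they do not depend on the same encoder. */
static int can_enable_both(const struct kms_connector *a,
			   const struct kms_connector *b)
{
	return a->enc != b->enc;
}

int main(void)
{
	struct kms_encoder enc0 = { 0 };
	struct kms_connector vga = { "VGA-1", &enc0 };
	struct kms_connector dvi = { "DVI-I-1", &enc0 };	/* shares enc0 */

	printf("VGA-1 + DVI-I-1 at the same time: %s\n",
	       can_enable_both(&vga, &dvi) ? "maybe" : "no");
	return 0;
}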

The crtc helpers use the encoders as real entities, and if you opt to
use the crtc helpers to implement the modeset sequence in your driver
it makes sense to map them to some real piece of hw. But you can
essentially pick any transcoder in your crtc-to-final-output chain for
this. Generic userspace needs to be able to cope with a failed modeset
for arbitrary reasons anyway, so it can't presume that a modeset will
work simply because the currently exposed constraints are fulfilled.

 I really dislike the first option, as it would severely limit where CDF
 can be used, or would force you to write some kind of combined drivers,
 so that you can have one encoder driver running multiple encoder devices.

Imo CDF and drm encoders don't really have that much to do with each
other; it should just be a driver implementation detail. Of course,
if common patterns emerge we could extract them somehow. E.g. if many
drivers end up exposing the CDF transcoder chain as a drm encoder
using the crtc helpers, we could add some library functions to make
that simpler.

Another conclusion (at least from my pov) from the fosdem discussion
is that we should separate the panel interface from the actual
control/pixel data buses. That should give us more flexibility for
insane hw, and also allow directly exposing properties and knobs from
e.g. dsi transcoders to the userspace interface. So I don't think
we'll end up with _the_ canonical CDF sink interface anyway.
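
A very rough sketch of what that separation could look like, with
made-up names (nothing here is an agreed-on CDF interface): the panel
driver binds to a control-bus interface and a video-bus interface
independently, so the same panel driver could sit behind e.g. a DSI,
DBI or SPI control path.

/*
 * Purely illustrative: separate "control bus" and "video bus"
 * interfaces, with a panel driver binding to both.
 */
#include <stddef.h>
#include <stdio.h>

/* Control path: how commands reach the peripheral (DSI, DBI, SPI, I2C...). */
struct control_bus_ops {
	int (*write)(void *bus, const void *buf, size_t len);
};

/* Video path: how the pixel stream is started and stopped. */
struct video_bus_ops {
	int (*enable_stream)(void *bus);
};

/* A panel binds to both, so one panel driver can work over any control bus. */
struct panel {
	const char *name;
	void *control_bus;
	const struct control_bus_ops *ctrl;
	void *video_bus;
	const struct video_bus_ops *video;
};

static int panel_enable(struct panel *p)
{
	static const unsigned char exit_sleep[] = { 0x11 }; /* example command */

	if (p->ctrl->write(p->control_bus, exit_sleep, sizeof(exit_sleep)))
		return -1;
	return p->video->enable_stream(p->video_bus);
}

/* Dummy backends standing in for a real DSI host. */
static int dsi_write(void *bus, const void *buf, size_t len)
{
	(void)bus; (void)buf;
	printf("control bus: wrote %zu byte(s)\n", len);
	return 0;
}

static int dsi_enable_stream(void *bus)
{
	(void)bus;
	printf("video bus: stream enabled\n");
	return 0;
}

int main(void)
{
	const struct control_bus_ops ctrl = { dsi_write };
	const struct video_bus_ops video = { dsi_enable_stream };
	struct panel p = { "example-panel", NULL, &ctrl, NULL, &video };

	return panel_enable(&p);
}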
-Daniel
-- 
Daniel Vetter
Software Engineer, Intel Corporation
+41 (0) 79 365 57 48 - http://blog.ffwll.ch