Re: Video standards

2024-04-05 Thread salsaman
On Fri, 5 Apr 2024 at 12:57, Pekka Paalanen 
wrote:

> On Fri, 5 Apr 2024 08:28:27 -0300
> salsaman  wrote:
>
> > I don't think you are paying enough attention to the main points. It is
> not
> > simply a case of extending the fourcc values to include more. If I didn't
> > make it clear enough, the whole fourcc system is obscure, inadequate,
> > ambiguous. The only reason ever to use it would be when you don't have
> meta
> > data and you are forced to encode the format in the first 4 bytes.
>
> Right. You must be talking about some other fourcc system. There are
> many of them, and some combine multiple orthogonal things into a single
> enumeration, which then becomes very difficult to extend and work with.
>
> drm_fourcc.h is not one of those.
>

I am talking about any system which tries to enumerate palettes (pixel
formats) in four bytes in a non-sequential way.
In my own system (Weed), for example, all RGB palettes are in the range 1 -
511, YUV palettes are 512 - 1023, and alpha palettes are 1024+

In fact this header is enough to define every possible palette: there are
standard enumerations for the most commonly used palettes, and
advanced palettes allow for the composition of new ones. It also contains
symbolic names for gamma types and YUV details;
interlacing and flags for pre / post alpha are kept in another header.
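A sketch of how such a range-based enumeration can be queried; the macro names here are invented for illustration and are not the actual Weed API:

```c
/* Classify a palette value by the ranges described above:
   RGB 1-511, YUV 512-1023, alpha 1024 and up.  Macro names invented. */
#define PAL_IS_RGB(p)   ((p) >= 1    && (p) <= 511)
#define PAL_IS_YUV(p)   ((p) >= 512  && (p) <= 1023)
#define PAL_IS_ALPHA(p) ((p) >= 1024)
```

With sequential ranges, a single comparison answers "is this an RGB palette?", which a non-sequential fourcc value cannot do.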

>
> Metadata is always necessary anyway, either implied or explicit.
>

Exactly, so I don't know why you keep mentioning fourcc as if it were some
kind of complete solution.


>
> > Colorimetry is only relevant when displaying on a monitor. In the video
> > world we just have red, green and blue (plus alpha, y, u and v). These
> are
> > just labels for the colour channels, mapping them to bit formats.
>
> That is a very surprising opinion. Have you worked on HDR imagery?
> Or wide color gamut?
>

As I have mentioned several times, these are display output parameters.
The only details which are relevant are the YUV/RGB conversion constants
and the gamma transfer values. With those I can convert between any two
formats, which is all that is necessary for the steps between decoding and
encoding / display.




>
> > The values I mentioned are all necessary if you want to convert from one
> > colourspace to another. For example if I decode a video frame and the pix
> > format is YUV420P then to convert it to RGBA to display via openGL, I
> need
> > to know the YUV subspace (bt709 or itu601) and whether the values are
> > clamped or full range. Then I apply the standard conversion factors (Kr =
> > 0.2126, Kb = 0.0722 for bt709). This cannot be derived from the fourcc
> > (generally). No doubt there is a standard definition of the
> > R,G,B primaries, but that isn't a concern.  I just feed the values into an
> > openGL texture buffer, an SDL buffer, a gdkpixbuf, QImage or whatever
> and
> > ask for it to be displayed. Now in an application I may optionally offer
> > the user filters to adjust the white balance, contrast, display gamma
> etc.
> > but that is outside of the scope of what I am proposing.
>
> Yes, those are all important properties, and not enough.
>
Let's just say that the final display output is out of scope; what else is
missing?
Pre / post alpha is required for conversion between formats. I hadn't
mentioned that because I was trying to avoid going into every little detail.




> > And no, it is not a case of "adding another standard" and confusing
> things,
> > there is no standard.
>
> There are standards. ITU-T H.273, coding-independent code points, for
> example. That combines well with drm_fourcc.h. Also ICC combines well
> with drm_fourcc.h. This works, because drm_fourcc.h does not attempt to
> define anything outside of the memory layout and abstract channels.
>
Sorry, what I meant is there are standards on paper, but there is no
standard set of enumerations (implementation vs specification).
Instead we have multiple implementations, each with their own definitions.
In fact somewhere above I actually linked to the ITU709 standard.

> > I just had a look at pipewire, there is nothing bad about it per se, they
> > mention their palette values are based on gstreamer. So fine, we have yet
> > another library specific set of definitions.
> >
> > It's like I am trying to invent Esperanto, and all you can say is "oh
> > you don't like English, well have you considered speaking German instead
> ?"
>
> That does seem like an apt analogue.
>
> >
> > Well that is it, I am done. I was asked how XDG video could be useful. I
> > explained the shortcomings of what exists currently, and outlined various
> > ways in which having a standard could be useful.

Re: Video standards

2024-04-05 Thread salsaman
I don't think you are paying enough attention to the main points. It is not
simply a case of extending the fourcc values to include more. If I didn't
make it clear enough, the whole fourcc system is obscure, inadequate,
ambiguous. The only reason ever to use it would be when you don't have
metadata and you are forced to encode the format in the first 4 bytes.

Colorimetry is only relevant when displaying on a monitor. In the video
world we just have red, green and blue (plus alpha, y, u and v). These are
just labels for the colour channels, mapping them to bit formats.

The values I mentioned are all necessary if you want to convert from one
colourspace to another. For example if I decode a video frame and the pix
format is YUV420P then to convert it to RGBA to display via openGL, I need
to know the YUV subspace (bt709 or itu601) and whether the values are
clamped or full range. Then I apply the standard conversion factors (Kr =
0.2126, Kb = 0.0722 for bt709). This cannot be derived from the fourcc
(generally). No doubt there is a standard definition of the
R,G,B primaries, but that isn't a concern.  I just feed the values into an
openGL texture buffer, an SDL buffer, a gdkpixbuf, QImage or whatever and
ask for it to be displayed. Now in an application I may optionally offer
the user filters to adjust the white balance, contrast, display gamma etc.
but that is outside of the scope of what I am proposing.
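To make the conversion step concrete, here is a minimal sketch of an 8-bit BT.709 YCbCr to RGB conversion handling both clamped (MPEG) and full range, using the Kr / Kb constants quoted above; this is illustrative code, not taken from any particular library:

```c
#include <stdint.h>

#define KR 0.2126               /* BT.709 red coefficient              */
#define KB 0.0722               /* BT.709 blue coefficient             */
#define KG (1.0 - KR - KB)      /* green follows from Kr + Kg + Kb = 1 */

static uint8_t clamp8(double v) {
    return v < 0.0 ? 0 : v > 255.0 ? 255 : (uint8_t)(v + 0.5);
}

/* Convert one BT.709 YCbCr pixel to RGB.  If `clamped` is nonzero the
   input uses the MPEG ranges (Y 16-235, chroma 16-240), otherwise the
   full 0-255 (JPEG) range. */
void yuv_to_rgb_bt709(uint8_t y, uint8_t u, uint8_t v, int clamped,
                      uint8_t *r, uint8_t *g, uint8_t *b)
{
    double Y  = clamped ? (y - 16) * 255.0 / 219.0 : (double)y;
    double Cb = clamped ? (u - 128) * 255.0 / 224.0 : u - 128.0;
    double Cr = clamped ? (v - 128) * 255.0 / 224.0 : v - 128.0;

    double R = Y + 2.0 * (1.0 - KR) * Cr;
    double B = Y + 2.0 * (1.0 - KB) * Cb;
    double G = (Y - KR * R - KB * B) / KG;

    *r = clamp8(R); *g = clamp8(G); *b = clamp8(B);
}
```

Note how the range flag and the subspace constants are inputs the fourcc alone cannot supply.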

And no, it is not a case of "adding another standard" and confusing things,
there is no standard.

I just had a look at pipewire; there is nothing bad about it per se. They
mention their palette values are based on gstreamer. So fine, we have yet
another library-specific set of definitions.

It's like I am trying to invent Esperanto, and all you can say is "oh
you don't like English, well have you considered speaking German instead ?"

Well that is it, I am done. I was asked how XDG video could be useful. I
explained the shortcomings of what exists currently, and outlined various
ways in which having a standard could be useful.

But if there is no will for this, then I am not going to waste any more of
my time on this. My own standards work very well for my own purposes, and
if I ever wanted to use pipewire for example, I can simply add the
constants to my compatibility header.


Cheers.
G,




On Fri, 5 Apr 2024 at 06:34, Pekka Paalanen 
wrote:

> On Thu, 4 Apr 2024 17:13:40 -0300
> salsaman  wrote:
>
> > Hi,
> > the problem with the drm.h header is, it is complicated, still needs
> > interpretation, and it lacks some commonly used formats, (e.g YUVAp)
>
> They accept additions, if the additions serve userspace
> interoperability. There is no requirement to actually use the format in
> the kernel.
>
> Pixel formats are complicated, yes. There are too many pixel format
> enumerations, every one differently defined, sure. I wouldn't add yet
> another system of definitions.
>
> > Also it doesn't address the gamma value (linear, sRGB, bt709), or the yuv
> > subspace (eg Y'CbCr vs bt709), the yuv range (16 - 235 luma / 16 - 240
> > chroma = clamped / mpeg; 0 - 255 = unclamped / full / jpeg) or the uv
> > sampling position (e.g. center, top_left)
>
> My opinion is that that none of that is relevant to a pixel format.
> These are additional information that must be decoupled from the pixel
> format to avoid a combinatorial explosion of the format enumeration,
> which is already massive even without them. A pixel format only
> describes a part of the memory layout: which set of bits forms a raw
> channel value of a pixel, and what are the channel names. Giving any
> further meaning to those raw values is for other metadata.
>
> What about colorimetry? Primaries and white point, dynamic range, plus
> the difference between encoding colorimetry (container color volume)
> and the usable/used colorimetry (target color volume, which is present
> in e.g. HDR static metadata typical for BT.2100/PQ signals in the form
> of the Mastering Display Color Volume).
>
> What about the assumed viewing environment, if we want to go from just
> stimulus towards appearance?
>
> > I can see that having some common definitions would be useful for
> > exchanging data between applications. Eg  my app gets a frame buffer and
> > metadata XDG_VIDEO_PALETTE_RGB24, XDG_VIDEO_GAMMA_LINEAR
> > then I know unambiguously that this is planar RGB 8:8:8 (so forget
> little /
> > big endian) and that the values are encoded with linear (not sRGB) gamma.
>
> > If you want to be more specific with palettes, then you could do so, but
> it
> > might require defining metadata structs,
>
> > I'll try to explain the rationale a bit. In the audio world it is quite
> > common for apps to send audio from one to another. Generally speaking they
> > would send or receive via an audio server, e.g pulseaudio, jack.

Re: Video standards

2024-04-04 Thread salsaman
I missed out a link in an earlier email, this is the dbus player control
which I mentioned:
https://www.freedesktop.org/wiki/Specifications/mpris-spec/


A dbus <---> OSC endpoint could actually make this super useful, perhaps I
will do that when I have some time,


On Thu, 4 Apr 2024 at 21:57, salsaman  wrote:

> Just to re emphasise, the nearest we have presently are AV_PIXFMT_*  which
> are library specific values, and lack some important values
> (there is no yuv888 packed for example). And the drm.h file is based on
> monitor standards, and also lacks values like 'R', 'G', 'B', 'A' *
>
> I think we can agree there is a gap that could be filled by an agreed set
> of definitions. I don't mean technical definitions, we can just point to the
> standards
>  e.g https://glenwing.github.io/docs/ITU-R-BT.709-1.pdf
> and an enumeration XDG_VIDEO_GAMMA_BT709 (== XDG_VIDEO_GAMMA_ITU709)
>
> G,
>
> * (also I would dispute their ambiguous yuv411 definition - if it were
> yuv411p I would agree, otherwise it could be the camera format UYYVYY
> packed).
>
> On Thu, 4 Apr 2024 at 18:40, salsaman  wrote:
>
>> I'll try to explain the rationale a bit. In the audio world it is quite
>> common for apps to send audio from one to another. Generally speaking they
>> would send or receive via an audio server, e.g pulseaudio, jack.
>> Now imagine the same for video, let us suppose you have an app that
>> generates video effects from audio. Now you want to send the output to
>> another app, let's say you have a super optimised openGL video player.
>> You could imagine connecting the 2 apps via dbus for example. The first
>> app, the generator, sends a frame sync signal each time a frame is
>> produced, and includes a pointer to the frame buffer, and the frame size.
>> But how does it describe the format of the frame pixel data ? Is it RGB24
>> ? yuv420p ? if it is rgb, is it sRGB gamma or linear ?
>> Well, you could maybe guess the first 4 bytes are a fourcc code. Then you
>> write a lot of code to parse the 4cc and figure out what it might be,
>> Or the easier way, you query the app and it responds with XDG constants.
>>
>> G,
>>
>> On Thu, 4 Apr 2024 at 17:13, salsaman  wrote:
>>
>>> Hi,
>>> the problem with the drm.h header is, it is complicated, still needs
>>> interpretation, and it lacks some commonly used formats, (e.g YUVAp)
>>> Also it doesn't address the gamma value (linear, sRGB, bt709), or the
>>> yuv subspace (eg Y'CbCr vs bt709), the yuv range (16 - 235 luma / 16 - 240
>>> chroma = clamped / mpeg; 0 - 255 = unclamped / full / jpeg) or the uv
>>> sampling position (e.g. center, top_left)
>>>
>>> I can see that having some common definitions would be useful for
>>> exchanging data between applications. Eg  my app gets a frame buffer and
>>> metadata XDG_VIDEO_PALETTE_RGB24, XDG_VIDEO_GAMMA_LINEAR
>>> then I know unambiguously that this is planar RGB 8:8:8 (so forget
>>> little / big endian) and that the values are encoded with linear (not sRGB)
>>> gamma.
>>>
>>> If you want to be more specific with palettes, then you could do so, but
>>> it might require defining metadata structs,
>>>
>>> For example for my own standard (Weed effects) I have:
>>>
>>> // max number of channels in a palette
>>>
>>>
>>>
>>> #ifndef WEED_MAXPCHANS
>>> #define WEED_MAXPCHANS 8
>>> #endif
>>>
>>> // max number of planes in a palette
>>>
>>>
>>>
>>> #ifndef WEED_MAXPPLANES
>>> #define WEED_MAXPPLANES 4
>>> #endif
>>>
>>> #define WEED_VCHAN_end  0
>>>
>>> #define WEED_VCHAN_red      1
>>> #define WEED_VCHAN_green    2
>>> #define WEED_VCHAN_blue     3
>>>
>>> #define WEED_VCHAN_Y        512
>>> #define WEED_VCHAN_U        513
>>> #define WEED_VCHAN_V        514
>>>
>>> #define WEED_VCHAN_alpha    1024
>>>
>>> #define WEED_VCHAN_FIRST_CUSTOM 8192
>>>
>>> #define WEED_VCHAN_DESC_PLANAR  (1 << 0) ///< planar type
>>>
>>>
>>>
>>> #define WEED_VCHAN_DESC_FP  (1 << 1) ///< floating point
>>> type
>>>
>>>
>>> #define WEED_VCHAN_DESC_BE  (1 << 2) ///< pixel data is big
>>> endian (within each component)
>>>
>>>
>>>
>>> #define WEED_VCHAN_DESC_FIRST_CUSTOM(1 << 16)
>>&

Re: Video standards

2024-04-04 Thread salsaman
Just to re emphasise, the nearest we have presently are AV_PIXFMT_*  which
are library specific values, and lack some important values
(there is no yuv888 packed for example). And the drm.h file is based on
monitor standards, and also lacks values like 'R', 'G', 'B', 'A' *

I think we can agree there is a gap that could be filled by an agreed set
of definitions. I don't mean technical definitions, we can just point to the
standards
 e.g https://glenwing.github.io/docs/ITU-R-BT.709-1.pdf
and an enumeration XDG_VIDEO_GAMMA_BT709 (== XDG_VIDEO_GAMMA_ITU709)

G,

* (also I would dispute their ambiguous yuv411 definition - if it were
yuv411p I would agree, otherwise it could be the camera format UYYVYY
packed).

On Thu, 4 Apr 2024 at 18:40, salsaman  wrote:

> I'll try to explain the rationale a bit. In the audio world it is quite
> common for apps to send audio from one to another. Generally speaking they
> would send or receive via an audio server, e.g pulseaudio, jack.
> Now imagine the same for video, let us suppose you have an app that
> generates video effects from audio. Now you want to send the output to
> another app, let's say you have a super optimised openGL video player.
> You could imagine connecting the 2 apps via dbus for example. The first
> app, the generator, sends a frame sync signal each time a frame is
> produced, and includes a pointer to the frame buffer, and the frame size.
> But how does it describe the format of the frame pixel data ? Is it RGB24
> ? yuv420p ? if it is rgb, is it sRGB gamma or linear ?
> Well, you could maybe guess the first 4 bytes are a fourcc code. Then you
> write a lot of code to parse the 4cc and figure out what it might be,
> Or the easier way, you query the app and it responds with XDG constants.
>
> G,
>
> On Thu, 4 Apr 2024 at 17:13, salsaman  wrote:
>
>> Hi,
>> the problem with the drm.h header is, it is complicated, still needs
>> interpretation, and it lacks some commonly used formats, (e.g YUVAp)
>> Also it doesn't address the gamma value (linear, sRGB, bt709), or the yuv
>> subspace (eg Y'CbCr vs bt709), the yuv range (16 - 235 luma / 16 - 240 chroma
>> = clamped / mpeg; 0 - 255 = unclamped / full / jpeg) or the uv sampling
>> position (e.g. center, top_left)
>>
>> I can see that having some common definitions would be useful for
>> exchanging data between applications. Eg  my app gets a frame buffer and
>> metadata XDG_VIDEO_PALETTE_RGB24, XDG_VIDEO_GAMMA_LINEAR
>> then I know unambiguously that this is planar RGB 8:8:8 (so forget little
>> / big endian) and that the values are encoded with linear (not sRGB) gamma.
>>
>> If you want to be more specific with palettes, then you could do so, but
>> it might require defining metadata structs,
>>
>> For example for my own standard (Weed effects) I have:
>>
>> // max number of channels in a palette
>>
>>
>>
>> #ifndef WEED_MAXPCHANS
>> #define WEED_MAXPCHANS 8
>> #endif
>>
>> // max number of planes in a palette
>>
>>
>>
>> #ifndef WEED_MAXPPLANES
>> #define WEED_MAXPPLANES 4
>> #endif
>>
>> #define WEED_VCHAN_end  0
>>
>> #define WEED_VCHAN_red      1
>> #define WEED_VCHAN_green    2
>> #define WEED_VCHAN_blue     3
>>
>> #define WEED_VCHAN_Y        512
>> #define WEED_VCHAN_U        513
>> #define WEED_VCHAN_V        514
>>
>> #define WEED_VCHAN_alpha    1024
>>
>> #define WEED_VCHAN_FIRST_CUSTOM 8192
>>
>> #define WEED_VCHAN_DESC_PLANAR  (1 << 0) ///< planar type
>>
>>
>>
>> #define WEED_VCHAN_DESC_FP  (1 << 1) ///< floating point type
>>
>>
>>
>> #define WEED_VCHAN_DESC_BE  (1 << 2) ///< pixel data is big
>> endian (within each component)
>>
>>
>>
>> #define WEED_VCHAN_DESC_FIRST_CUSTOM(1 << 16)
>>
>> typedef struct {
>>   uint16_t ext_ref;  ///< link to an enumerated type
>>
>>
>>
>>   uint16_t chantype[WEED_MAXPCHANS]; ///  e.g. {WEED_VCHAN_U,
>> WEED_VCHAN_Y, WEED_VCHAN_V, WEED_VCHAN_Y}
>>
>>
>>   uint32_t flags; /// bitmap of flags, eg. WEED_VCHAN_DESC_FP |
>> WEED_VCHAN_DESC_PLANAR
>>
>>
>>   uint8_t  hsub[WEED_MAXPCHANS];  /// horiz. subsampling, 0 or 1 means no
>> subsampling, 2 means halved etc. (planar only)
>>   uint8_t  vsub[WEED_MAXPCHANS];  /// vert subsampling
>>
>>
>>
>>   uint8_t npixels; ///< npixels per macropixel: {0, 1} == 1
>>
>>
>>
>>   uint8_t bitsize[WEED_MAXPCHANS]; // 8 if not specified

Re: Video standards

2024-04-04 Thread salsaman
I'll try to explain the rationale a bit. In the audio world it is quite
common for apps to send audio from one to another. Generally speaking they
would send or receive via an audio server, e.g pulseaudio, jack.
Now imagine the same for video, let us suppose you have an app that
generates video effects from audio. Now you want to send the output to
another app, let's say you have a super optimised openGL video player.
You could imagine connecting the 2 apps via dbus for example. The first
app, the generator, sends a frame sync signal each time a frame is
produced, and includes a pointer to the frame buffer, and the frame size.
But how does it describe the format of the frame pixel data ? Is it RGB24 ?
yuv420p ? if it is rgb, is it sRGB gamma or linear ?
Well, you could maybe guess the first 4 bytes are a fourcc code. Then you
write a lot of code to parse the 4cc and figure out what it might be.
Or the easier way: you query the app and it responds with XDG constants.
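A sketch of what such a frame-sync payload could carry; the struct and field names are invented for illustration, assuming an agreed set of constants exists:

```c
#include <stddef.h>
#include <stdint.h>

/* Invented frame descriptor: rather than guessing a fourcc from the
   first 4 bytes, the sender passes agreed enumeration values along
   with the buffer pointer and size. */
typedef struct {
    void    *data;      /* frame pixel data                          */
    size_t   size;      /* buffer size in bytes                      */
    uint32_t width;     /* frame width in pixels                     */
    uint32_t height;    /* frame height in rows                      */
    uint32_t palette;   /* an agreed XDG_VIDEO_PALETTE_* style value */
    uint32_t gamma;     /* an agreed XDG_VIDEO_GAMMA_* style value   */
} frame_desc_t;
```

The receiver then needs no parsing logic at all, only a lookup against the shared enumeration.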

G,

On Thu, 4 Apr 2024 at 17:13, salsaman  wrote:

> Hi,
> the problem with the drm.h header is, it is complicated, still needs
> interpretation, and it lacks some commonly used formats, (e.g YUVAp)
> Also it doesn't address the gamma value (linear, sRGB, bt709), or the yuv
> subspace (eg Y'CbCr vs bt709), the yuv range (16 - 235 luma / 16 - 240 chroma
> = clamped / mpeg; 0 - 255 = unclamped / full / jpeg) or the uv sampling
> position (e.g. center, top_left)
>
> I can see that having some common definitions would be useful for
> exchanging data between applications. Eg  my app gets a frame buffer and
> metadata XDG_VIDEO_PALETTE_RGB24, XDG_VIDEO_GAMMA_LINEAR
> then I know unambiguously that this is planar RGB 8:8:8 (so forget little
> / big endian) and that the values are encoded with linear (not sRGB) gamma.
>
> If you want to be more specific with palettes, then you could do so, but
> it might require defining metadata structs,
>
> For example for my own standard (Weed effects) I have:
>
> // max number of channels in a palette
>
>
>
> #ifndef WEED_MAXPCHANS
> #define WEED_MAXPCHANS 8
> #endif
>
> // max number of planes in a palette
>
>
>
> #ifndef WEED_MAXPPLANES
> #define WEED_MAXPPLANES 4
> #endif
>
> #define WEED_VCHAN_end  0
>
> #define WEED_VCHAN_red      1
> #define WEED_VCHAN_green    2
> #define WEED_VCHAN_blue     3
>
> #define WEED_VCHAN_Y        512
> #define WEED_VCHAN_U        513
> #define WEED_VCHAN_V        514
>
> #define WEED_VCHAN_alpha    1024
>
> #define WEED_VCHAN_FIRST_CUSTOM 8192
>
> #define WEED_VCHAN_DESC_PLANAR  (1 << 0) ///< planar type
>
>
>
> #define WEED_VCHAN_DESC_FP  (1 << 1) ///< floating point type
>
>
>
> #define WEED_VCHAN_DESC_BE  (1 << 2) ///< pixel data is big
> endian (within each component)
>
>
>
> #define WEED_VCHAN_DESC_FIRST_CUSTOM(1 << 16)
>
> typedef struct {
>   uint16_t ext_ref;  ///< link to an enumerated type
>
>
>
>   uint16_t chantype[WEED_MAXPCHANS]; ///  e.g. {WEED_VCHAN_U,
> WEED_VCHAN_Y, WEED_VCHAN_V, WEED_VCHAN_Y}
>
>
>   uint32_t flags; /// bitmap of flags, eg. WEED_VCHAN_DESC_FP |
> WEED_VCHAN_DESC_PLANAR
>
>
>   uint8_t  hsub[WEED_MAXPCHANS];  /// horiz. subsampling, 0 or 1 means no
> subsampling, 2 means halved etc. (planar only)
>   uint8_t  vsub[WEED_MAXPCHANS];  /// vert subsampling
>
>
>
>   uint8_t npixels; ///< npixels per macropixel: {0, 1} == 1
>
>
>
>   uint8_t bitsize[WEED_MAXPCHANS]; // 8 if not specified
>   void *extended; ///< pointer to app defined data
>
>
>
> } weed_macropixel_t;
>
> Then I can describe all my palettes like:
> advp[0] = (weed_macropixel_t) {
> WEED_PALETTE_RGB24,
> {WEED_VCHAN_red, WEED_VCHAN_green, WEED_VCHAN_blue}
>   };
>
>  advp[6] = (weed_macropixel_t) {
> WEED_PALETTE_RGBAFLOAT,
> {WEED_VCHAN_red, WEED_VCHAN_green, WEED_VCHAN_blue, WEED_VCHAN_alpha},
> WEED_VCHAN_DESC_FP, {0}, {0}, 1, {32, 32, 32, 32}
>   };
>
>  advp[7] = (weed_macropixel_t) {
> WEED_PALETTE_YUV420P,
> {WEED_VCHAN_Y, WEED_VCHAN_U, WEED_VCHAN_V},
> WEED_VCHAN_DESC_PLANAR, {1, 2, 2}, {1, 2, 2}
>   };
>
> IMO this is way superior to fourcc and if you were to supplement this with
> gamma, interlace, yuv subspace, yuv clamping and yuv sampling, then you
> would have a very comprehensive definition for any type of video frame.
>
> G.
>
>
>
> On Thu, 4 Apr 2024 at 08:52, Pekka Paalanen 
> wrote:
>
>> On Wed, 3 Apr 2024 21:51:39 -0300
>> salsaman  wrote:
>>
>> > Regarding my expertise, I was one of the developers mo

Re: Video standards

2024-04-04 Thread salsaman
Hi,
the problem with the drm.h header is, it is complicated, still needs
interpretation, and it lacks some commonly used formats, (e.g YUVAp)
Also it doesn't address the gamma value (linear, sRGB, bt709), or the yuv
subspace (eg Y'CbCr vs bt709), the yuv range (16 - 235 luma / 16 - 240 chroma
= clamped / mpeg; 0 - 255 = unclamped / full / jpeg) or the uv sampling
position (e.g. center, top_left)

I can see that having some common definitions would be useful for
exchanging data between applications. Eg  my app gets a frame buffer and
metadata XDG_VIDEO_PALETTE_RGB24, XDG_VIDEO_GAMMA_LINEAR
then I know unambiguously that this is planar RGB 8:8:8 (so forget little /
big endian) and that the values are encoded with linear (not sRGB) gamma.

If you want to be more specific with palettes, then you could do so, but it
might require defining metadata structs,

For example for my own standard (Weed effects) I have:

// max number of channels in a palette



#ifndef WEED_MAXPCHANS
#define WEED_MAXPCHANS 8
#endif

// max number of planes in a palette



#ifndef WEED_MAXPPLANES
#define WEED_MAXPPLANES 4
#endif

#define WEED_VCHAN_end  0

#define WEED_VCHAN_red      1
#define WEED_VCHAN_green    2
#define WEED_VCHAN_blue     3

#define WEED_VCHAN_Y        512
#define WEED_VCHAN_U        513
#define WEED_VCHAN_V        514

#define WEED_VCHAN_alpha    1024

#define WEED_VCHAN_FIRST_CUSTOM 8192

#define WEED_VCHAN_DESC_PLANAR  (1 << 0) ///< planar type



#define WEED_VCHAN_DESC_FP  (1 << 1) ///< floating point type



#define WEED_VCHAN_DESC_BE  (1 << 2) ///< pixel data is big
endian (within each component)



#define WEED_VCHAN_DESC_FIRST_CUSTOM(1 << 16)

typedef struct {
  uint16_t ext_ref;  ///< link to an enumerated type



  uint16_t chantype[WEED_MAXPCHANS]; ///  e.g. {WEED_VCHAN_U, WEED_VCHAN_Y,
WEED_VCHAN_V, WEED_VCHAN_Y}


  uint32_t flags; /// bitmap of flags, eg. WEED_VCHAN_DESC_FP |
WEED_VCHAN_DESC_PLANAR


  uint8_t  hsub[WEED_MAXPCHANS];  /// horiz. subsampling, 0 or 1 means no
subsampling, 2 means halved etc. (planar only)
  uint8_t  vsub[WEED_MAXPCHANS];  /// vert subsampling



  uint8_t npixels; ///< npixels per macropixel: {0, 1} == 1



  uint8_t bitsize[WEED_MAXPCHANS]; // 8 if not specified
  void *extended; ///< pointer to app defined data



} weed_macropixel_t;

Then I can describe all my palettes like:
advp[0] = (weed_macropixel_t) {
WEED_PALETTE_RGB24,
{WEED_VCHAN_red, WEED_VCHAN_green, WEED_VCHAN_blue}
  };

 advp[6] = (weed_macropixel_t) {
WEED_PALETTE_RGBAFLOAT,
{WEED_VCHAN_red, WEED_VCHAN_green, WEED_VCHAN_blue, WEED_VCHAN_alpha},
WEED_VCHAN_DESC_FP, {0}, {0}, 1, {32, 32, 32, 32}
  };

 advp[7] = (weed_macropixel_t) {
WEED_PALETTE_YUV420P,
{WEED_VCHAN_Y, WEED_VCHAN_U, WEED_VCHAN_V},
WEED_VCHAN_DESC_PLANAR, {1, 2, 2}, {1, 2, 2}
  };

IMO this is way superior to fourcc and if you were to supplement this with
gamma, interlace, yuv subspace, yuv clamping and yuv sampling, then you
would have a very comprehensive definition for any type of video frame.
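As a sketch of how a receiver might use such a descriptor, per-plane dimensions follow directly from the subsampling fields (the helper name here is invented):

```c
#include <stdint.h>

/* Apply a subsampling factor as described in the descriptor above:
   0 or 1 means no subsampling, 2 means halved, and so on. */
uint32_t plane_dim(uint32_t frame_dim, uint8_t sub) {
    return sub > 1 ? frame_dim / sub : frame_dim;
}
```

For the YUV420P entry above ({1, 2, 2} in hsub and vsub), a 1920x1080 frame yields a full-size Y plane and 960x540 U and V planes.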

G.



On Thu, 4 Apr 2024 at 08:52, Pekka Paalanen 
wrote:

> On Wed, 3 Apr 2024 21:51:39 -0300
> salsaman  wrote:
>
> > Regarding my expertise, I was one of the developers most involved in
> > developing the "livido" standard which was one of the main topics of the
> > Piksel Festivals held in Bergen, Norway.
> > In the early days (2004 - 2006) the focus of the annual event was
> precisely
> > the formulation of free / open standards, in this case for video effects.
> > Other contributors included:
> >  Niels Elburg, Denis "Jaromil" Rojo, Tom Schouten, Andraz Tori, Kentaro
> > Fukuchi and Carlo Prelz.
> > I've also been involved with and put forward proposals for common
> command /
> > query / reply actions (Open Media Control). To the extent that these
> > proposals have not gained traction, I don't ascribe this to a failing in
> > the proposals, but rather to a lack of developer awareness.
> >
> > Now regarding specific areas, I went back and reviewed some of the
> > available material at  https://www.freedesktop.org/wiki/Specifications/
> >
> > free media player specifications
> > https://www.freedesktop.org/wiki/Specifications/free-media-player-specs/
> > metadata standards for things like comments and ratings - talks mainly
> > about audio but describes video files also
> >
> > I am not a big fan of dbus, but this looks fine, it could be used for
> video
> > players. I'd be happier if it were a bit more abstracted and not tied to
> a
> > specific implementation (dbus). I could suggest some enhancements but I
> > guess this is a dbus thing and not an xdg thing.
>
> Thanks

Re: Video standards

2024-04-03 Thread salsaman
Regarding my expertise, I was one of the developers most involved in
developing the "livido" standard which was one of the main topics of the
Piksel Festivals held in Bergen, Norway.
In the early days (2004 - 2006) the focus of the annual event was precisely
the formulation of free / open standards, in this case for video effects.
Other contributors included:
 Niels Elburg, Denis "Jaromil" Rojo, Tom Schouten, Andraz Tori, Kentaro
Fukuchi and Carlo Prelz.
I've also been involved with and put forward proposals for common command /
query / reply actions (Open Media Control). To the extent that these
proposals have not gained traction, I don't ascribe this to a failing in
the proposals, but rather to a lack of developer awareness.

Now regarding specific areas, I went back and reviewed some of the
available material at  https://www.freedesktop.org/wiki/Specifications/

free media player specifications
https://www.freedesktop.org/wiki/Specifications/free-media-player-specs/
metadata standards for things like comments and ratings - talks mainly
about audio but describes video files also

I am not a big fan of dbus, but this looks fine, it could be used for video
players. I'd be happier if it were a bit more abstracted and not tied to a
specific implementation (dbus). I could suggest some enhancements but I
guess this is a dbus thing and not an xdg thing.

IMO what would be useful would be to define a common set of constants, most
specifically related to frame pixel formats.
The two most common in use are fourCC and avformat.

Consider a frame in UYVY format.

fourCC values:

 #define MK_FOURCC(a, b, c, d) (((uint32_t)a) | (((uint32_t)b) << 8) \
                                | (((uint32_t)c) << 16) | (((uint32_t)d) << 24))

MK_FOURCC('U', 'Y', 'V', 'Y')
but also
MK_FOURCC('I', 'U', 'Y', 'B')
the same but with interlacing
MK_FOURCC('H', 'D', 'Y', 'C')
same but bt709 (hdtv) encoding

so this requires interpretation by sender / receiver - a simpler way could
be with constants
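The three codes above can be computed with the MK_FOURCC macro quoted earlier; all three describe the same UYVY byte layout, with the interlacing and bt709 differences folded into the code value itself:

```c
#include <stdint.h>

/* Same macro as in the text, with parenthesised arguments. */
#define MK_FOURCC(a, b, c, d) (((uint32_t)(a)) | (((uint32_t)(b)) << 8) | \
                               (((uint32_t)(c)) << 16) | (((uint32_t)(d)) << 24))

/* Three distinct codes, one memory layout: the receiver must
   special-case each value to recover the metadata differences. */
static const uint32_t FCC_UYVY = MK_FOURCC('U', 'Y', 'V', 'Y');
static const uint32_t FCC_IUYB = MK_FOURCC('I', 'U', 'Y', 'B'); /* + interlacing  */
static const uint32_t FCC_HDYC = MK_FOURCC('H', 'D', 'Y', 'C'); /* + bt709 matrix */
```

With separate enumerations for layout, interlace and subspace, this special-casing disappears.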

- probably the nearest we have are ffmpeg / libav definitions, but this is
the wrong way around, a lib shouldn't define a global standard, the
standard should come first and the lib should align to that.

We have AV_PIX_FMT_UYVY422 which was formerly PIX_FMT_UYVY422
and AVCOL_TRC_BT709, which is actually the gamma transfer function. There
is no equivalent bt709 constant for bt709 yuv / rgb; instead this exists as
a matrix.

Now consider how much easier it would be to share data if we had the
following constants enumerated:

*XDG_VIDEO_PALETTE_UYVY*
*XDG_VIDEO_INTERLACE_TOP_FIRST*
*XDG_VIDEO_YUV_SUBSPACE_BT709*
*XDG_VIDEO_GAMMA_SRGB*

(this is an invented example, not intended to be a real example).

There is a bit more to it, but that should be enough to give a general idea.

G,

On Wed, 3 Apr 2024 at 08:12, Pekka Paalanen 
wrote:

> On Thu, 28 Mar 2024 19:19:33 -0300
> salsaman  wrote:
>
> > There are two hardware settings from the monitor that overlap video,
> these
> > are
> > - monitor aspect ratio
> > - monitor pixel aspect ratio
> > These are both useful when rendering video. The first defines how much
> > stretch or letterbocing to apply, the second defines non square pixels,
> > which is goof to know if you want to render fixed size objects (a circle
> > for example). Knowing the monitor size in RGB or Y plane pixels can also
> be
> > useful to define a max or min resize limit (whether it is min or max
> > depends on the desired display quality level)
>
> Thanks. I was trying to ask what kind of video standards you have
> experience and expertise in?
>
> I'm also interested in what kind of standards you see as missing. The
> Wayland extension aims to cover everything display related. I'm sure
> video file format specifications do their job.
>
> What would be left to define?
>
> What goals would there be?
>
> I suppose individual APIs like Pipewire might be lacking something, but
> that's a Pipewire API rather than an XDG standard. Or do we need an XDG
> standard to be used as the design guide and reference for APIs?
>
>
> Thanks,
> pq
>
> > On Thu, 28 Mar 2024 at 19:05, salsaman  wrote:
> >
> > > colour management and hdr mostly intersect with three areas of video:
> > > pixel formats, yuv <-> rgb conversions and gamma transfer functions.
> > > For example
> > > xdg_pixformat_yuv121010
> > > xdg_subspace_bt2020
> > > xdg_gamma_bt2020
> > >
> > > just off the top of my head, these arent intended to be actual
> suggestions
> > >
> > >
> > > On Thu, 28 Mar 2024 at 18:57, salsaman  wrote:
> > >
> > >> In addition, I am not sure if there are xdg standards for audio, but I
> > >> would suggest video and follow similar hierarchies, and that both could be
> > >> classed under a more generic xdg multimedia standard.

Re: Video standards

2024-03-28 Thread salsaman
There are two hardware settings from the monitor that overlap video, these
are
- monitor aspect ratio
- monitor pixel aspect ratio
These are both useful when rendering video. The first defines how much
stretch or letterboxing to apply, the second defines non-square pixels,
which is good to know if you want to render fixed size objects (a circle
for example). Knowing the monitor size in RGB or Y plane pixels can also be
useful to define a max or min resize limit (whether it is min or max
depends on the desired display quality level)
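A rough sketch of the arithmetic involved (the function and parameter names here are illustrative, not from any existing spec): given the monitor size and its pixel aspect ratio, the target size for a letterboxed or pillarboxed frame can be computed as:

```python
def fit_frame(frame_w, frame_h, monitor_w, monitor_h, pixel_aspect=1.0):
    """Scale a frame to a monitor, preserving the frame's aspect ratio.

    pixel_aspect is the monitor's pixel aspect ratio (physical width of
    one pixel divided by its height); 1.0 means square pixels.
    Returns the target size in monitor pixels; the remaining area is
    letterbox (top/bottom) or pillarbox (left/right) bars.
    """
    frame_aspect = frame_w / frame_h
    # The monitor's *display* aspect must account for non-square pixels.
    monitor_aspect = (monitor_w * pixel_aspect) / monitor_h
    if frame_aspect >= monitor_aspect:
        # Frame is relatively wider: fill the width, letterbox the height.
        return monitor_w, round(monitor_w * pixel_aspect / frame_aspect)
    # Frame is relatively taller: fill the height, pillarbox the width.
    return round(monitor_h * frame_aspect / pixel_aspect), monitor_h

print(fit_frame(1920, 1080, 1280, 1024))  # 16:9 frame on a 5:4 monitor -> (1280, 720)
```

The pixel aspect ratio is exactly what keeps a circle circular: a 1440x1080 monitor with 4:3 pixels behaves like a 1920x1080 square-pixel one.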

On Thu, 28 Mar 2024 at 19:05, salsaman  wrote:

> colour management and hdr mostly intersect with three areas of video:
> pixel formats, yuv <-> rgb conversions and gamma transfer functions.
> For example
> xdg_pixformat_yuv121010
> xdg_subspace_bt2020
> xdg_gamma_bt2020
>
> just off the top of my head, these aren't intended to be actual suggestions
>
>
> On Thu, 28 Mar 2024 at 18:57, salsaman  wrote:
>
>> In addition, I am not sure if there are xdg standards for audio, but I
>> would suggest that audio and video follow similar hierarchies, and that
>> both could be classed under a more generic xdg multimedia standard.
>>
>>
>> On Thu, 28 Mar 2024 at 18:48, salsaman  wrote:
>>
>>> Hi, IMO hardware related would be more appropriate under display
>>> standards
>>> Video standards could be more software related, and provide common
>>> definitions, for example, allowing exchange of information between
>>> applications which produce or consume video frames or streams of frames.
>>> Some examples I can think of might be
>>>  xdg_colorspace_RGB,
>>>  xdg_colorspace_YUV
>>>
>>> xdg_pixfmt_RGB24
>>> xdg_pixfmt_YUV420p
>>> etc
>>>
>>>  xdg_gamma_linear
>>>  xdg_gamma_sRGB
>>>
>>> xdg_video_width
>>> xdg_video_height
>>>
>>> I could provide a fuller list, but I think if it goes along this
>>> route, the starting point has to be deciding what we are setting out to
>>> achieve with the standards / definitions, and providing a range of
>>> speculative use cases.
>>>
>>> Gabriel (salsaman)
>>>
>>>
>>> On Thu, 28 Mar 2024 at 06:07, Pekka Paalanen <
>>> pekka.paala...@haloniitty.fi> wrote:
>>>
>>>> On Wed, 27 Mar 2024 11:45:00 -0300
>>>> salsaman  wrote:
>>>>
>>>> > ISTR that the xdg video standards were never defined. If you need any
>>>> > advice or assistance with this, I would be happy to act in an
>>>> > advisory capacity if that is called for. I have over 20 years
>>>> experience of
>>>> > developing Free Software video and have been an active participant in
>>>> > developing other video / effects standards. I have been a bit out of
>>>> the
>>>> > spotlight recently as I have been busy architecting and implementing
>>>> the
>>>> > core components of the upcoming next-gen LiVES 4.0 video application
>>>> plus
>>>> > its accompanying state-of-the-art effects standard.
>>>>
>>>> Hi,
>>>>
>>>> what kind of video standards would that be?
>>>>
>>>> I'm wondering if it would have anything to do with Wayland color
>>>> management and HDR:
>>>>
>>>> https://gitlab.freedesktop.org/pq/color-and-hdr
>>>>
>>>> https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/183
>>>>
>>>> https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/14
>>>>
>>>> Would there need to be any XDG standards to support color managed HDR
>>>> desktops, or is the window system support enough?
>>>>
>>>> I don't have much in mind, but then I've been staring only at the
>>>> window system interactions, and haven't seen what else the desktop
>>>> ecosystem or applications might need.
>>>>
>>>> Recommended display calibration and measurement procedures maybe?
>>>>
>>>> Desktop viewing environment standards?
>>>>
>>>> Viewing environment measurement?
>>>>
>>>> They could be as straightforward as referring to freely available
>>>> ITU-R or SMPTE papers or others, if there are suitable ones.
>>>>
>>>>
>>>> Thanks,
>>>> pq
>>>>
>>>


Re: Video standards

2024-03-28 Thread salsaman
colour management and hdr mostly intersect with three areas of video:
pixel formats, yuv <-> rgb conversions and gamma transfer functions.
For example
xdg_pixformat_yuv121010
xdg_subspace_bt2020
xdg_gamma_bt2020

just off the top of my head, these aren't intended to be actual suggestions
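As an illustration of the yuv <-> rgb part, the conversion constants reduce to the luma weights (Kr, Kb) of the chosen matrix; everything else derives from them. A minimal sketch using full-range values (the coefficients below are the published BT.601/709/2020 luma weights; the function names are made up for this example):

```python
# The named subspace just selects a (Kr, Kb) pair; Kg and the chroma
# scale factors are derived from those two numbers.
LUMA_COEFFS = {
    "bt601":  (0.299,  0.114),
    "bt709":  (0.2126, 0.0722),
    "bt2020": (0.2627, 0.0593),
}

def rgb_to_yuv(r, g, b, space="bt709"):
    """Full-range R'G'B' in [0,1] -> Y' in [0,1], U/V in [-0.5, 0.5]."""
    kr, kb = LUMA_COEFFS[space]
    kg = 1.0 - kr - kb
    y = kr * r + kg * g + kb * b
    u = (b - y) / (2.0 * (1.0 - kb))  # scaled so |U| <= 0.5
    v = (r - y) / (2.0 * (1.0 - kr))  # scaled so |V| <= 0.5
    return y, u, v

def yuv_to_rgb(y, u, v, space="bt709"):
    """Inverse of rgb_to_yuv for the same matrix."""
    kr, kb = LUMA_COEFFS[space]
    kg = 1.0 - kr - kb
    r = y + v * 2.0 * (1.0 - kr)
    b = y + u * 2.0 * (1.0 - kb)
    g = (y - kr * r - kb * b) / kg
    return r, g, b
```

Converting with one matrix and back with another is exactly the kind of mismatch that shared symbolic names for the subspace would prevent.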


On Thu, 28 Mar 2024 at 18:57, salsaman  wrote:

> In addition, I am not sure if there are xdg standards for audio, but I
> would suggest that audio and video follow similar hierarchies, and that
> both could be classed under a more generic xdg multimedia standard.
>
>
> On Thu, 28 Mar 2024 at 18:48, salsaman  wrote:
>
>> Hi, IMO hardware related would be more appropriate under display standards
>> Video standards could be more software related, and provide common
>> definitions, for example, allowing exchange of information between
>> applications which produce or consume video frames or streams of frames.
>> Some examples I can think of might be
>>  xdg_colorspace_RGB,
>>  xdg_colorspace_YUV
>>
>> xdg_pixfmt_RGB24
>> xdg_pixfmt_YUV420p
>> etc
>>
>>  xdg_gamma_linear
>>  xdg_gamma_sRGB
>>
>> xdg_video_width
>> xdg_video_height
>>
>> I could provide a fuller list, but I think if it goes along this
>> route, the starting point has to be deciding what we are setting out to
>> achieve with the standards / definitions, and providing a range of
>> speculative use cases.
>>
>> Gabriel (salsaman)
>>
>>
>> On Thu, 28 Mar 2024 at 06:07, Pekka Paalanen <
>> pekka.paala...@haloniitty.fi> wrote:
>>
>>> On Wed, 27 Mar 2024 11:45:00 -0300
>>> salsaman  wrote:
>>>
>>> > ISTR that the xdg video standards were never defined. If you need any
>>> > advice or assistance with this, I would be happy to act in an
>>> > advisory capacity if that is called for. I have over 20 years
>>> experience of
>>> > developing Free Software video and have been an active participant in
>>> > developing other video / effects standards. I have been a bit out of
>>> the
>>> > spotlight recently as I have been busy architecting and implementing
>>> the
>>> > core components of the upcoming next-gen LiVES 4.0 video application
>>> plus
>>> > its accompanying state-of-the-art effects standard.
>>>
>>> Hi,
>>>
>>> what kind of video standards would that be?
>>>
>>> I'm wondering if it would have anything to do with Wayland color
>>> management and HDR:
>>>
>>> https://gitlab.freedesktop.org/pq/color-and-hdr
>>>
>>> https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/183
>>>
>>> https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/14
>>>
>>> Would there need to be any XDG standards to support color managed HDR
>>> desktops, or is the window system support enough?
>>>
>>> I don't have much in mind, but then I've been staring only at the
>>> window system interactions, and haven't seen what else the desktop
>>> ecosystem or applications might need.
>>>
>>> Recommended display calibration and measurement procedures maybe?
>>>
>>> Desktop viewing environment standards?
>>>
>>> Viewing environment measurement?
>>>
>>> They could be as straightforward as referring to freely available
>>> ITU-R or SMPTE papers or others, if there are suitable ones.
>>>
>>>
>>> Thanks,
>>> pq
>>>
>>


Re: Video standards

2024-03-28 Thread salsaman
In addition, I am not sure if there are xdg standards for audio, but I
would suggest that audio and video follow similar hierarchies, and that both
could be classed under a more generic xdg multimedia standard.


On Thu, 28 Mar 2024 at 18:48, salsaman  wrote:

> Hi, IMO hardware related would be more appropriate under display standards
> Video standards could be more software related, and provide common
> definitions, for example, allowing exchange of information between
> applications which produce or consume video frames or streams of frames.
> Some examples I can think of might be
>  xdg_colorspace_RGB,
>  xdg_colorspace_YUV
>
> xdg_pixfmt_RGB24
> xdg_pixfmt_YUV420p
> etc
>
>  xdg_gamma_linear
>  xdg_gamma_sRGB
>
> xdg_video_width
> xdg_video_height
>
> I could provide a fuller list, but I think if it goes along this
> route, the starting point has to be deciding what we are setting out to
> achieve with the standards / definitions, and providing a range of
> speculative use cases.
>
> Gabriel (salsaman)
>
>
> On Thu, 28 Mar 2024 at 06:07, Pekka Paalanen 
> wrote:
>
>> On Wed, 27 Mar 2024 11:45:00 -0300
>> salsaman  wrote:
>>
>> > ISTR that the xdg video standards were never defined. If you need any
>> > advice or assistance with this, I would be happy to act in an
>> > advisory capacity if that is called for. I have over 20 years
>> experience of
>> > developing Free Software video and have been an active participant in
>> > developing other video / effects standards. I have been a bit out of the
>> > spotlight recently as I have been busy architecting and implementing the
>> > core components of the upcoming next-gen LiVES 4.0 video application
>> plus
>> > its accompanying state-of-the-art effects standard.
>>
>> Hi,
>>
>> what kind of video standards would that be?
>>
>> I'm wondering if it would have anything to do with Wayland color
>> management and HDR:
>>
>> https://gitlab.freedesktop.org/pq/color-and-hdr
>>
>> https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/183
>>
>> https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/14
>>
>> Would there need to be any XDG standards to support color managed HDR
>> desktops, or is the window system support enough?
>>
>> I don't have much in mind, but then I've been staring only at the
>> window system interactions, and haven't seen what else the desktop
>> ecosystem or applications might need.
>>
>> Recommended display calibration and measurement procedures maybe?
>>
>> Desktop viewing environment standards?
>>
>> Viewing environment measurement?
>>
>> They could be as straightforward as referring to freely available
>> ITU-R or SMPTE papers or others, if there are suitable ones.
>>
>>
>> Thanks,
>> pq
>>
>


Re: Video standards

2024-03-28 Thread salsaman
Hi, IMO hardware-related definitions would be more appropriate under display
standards. Video standards could be more software-related, providing common
definitions, for example, allowing exchange of information between
applications which produce or consume video frames or streams of frames.
Some examples I can think of might be
 xdg_colorspace_RGB,
 xdg_colorspace_YUV

xdg_pixfmt_RGB24
xdg_pixfmt_YUV420p
etc

 xdg_gamma_linear
 xdg_gamma_sRGB

xdg_video_width
xdg_video_height

I could provide a fuller list, but I think if it goes along this route,
the starting point has to be deciding what we are setting out to achieve
with the standards / definitions, and providing a range of speculative use
cases.
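As a sketch of what such common definitions could mean in practice (every name below is hypothetical, merely mirroring the examples above, not a proposed spec), a producer and a consumer could exchange a frame description like this:

```python
from enum import IntEnum

# Hypothetical xdg-style symbolic constants, mirroring the examples above.
class XdgColorspace(IntEnum):
    RGB = 1
    YUV = 2

class XdgPixfmt(IntEnum):
    RGB24 = 1
    YUV420P = 2

class XdgGamma(IntEnum):
    LINEAR = 1
    SRGB = 2

# A frame description one application could hand to another:
frame_info = {
    "colorspace": XdgColorspace.YUV,
    "pixfmt": XdgPixfmt.YUV420P,
    "gamma": XdgGamma.SRGB,
    "width": 1920,
    "height": 1080,
}
```

The point is only that both sides agree on the enumeration; the actual names and values would be for the standard to define.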

Gabriel (salsaman)


On Thu, 28 Mar 2024 at 06:07, Pekka Paalanen 
wrote:

> On Wed, 27 Mar 2024 11:45:00 -0300
> salsaman  wrote:
>
> > ISTR that the xdg video standards were never defined. If you need any
> > advice or assistance with this, I would be happy to act in an
> > advisory capacity if that is called for. I have over 20 years experience
> of
> > developing Free Software video and have been an active participant in
> > developing other video / effects standards. I have been a bit out of the
> > spotlight recently as I have been busy architecting and implementing the
> > core components of the upcoming next-gen LiVES 4.0 video application plus
> > its accompanying state-of-the-art effects standard.
>
> Hi,
>
> what kind of video standards would that be?
>
> I'm wondering if it would have anything to do with Wayland color
> management and HDR:
>
> https://gitlab.freedesktop.org/pq/color-and-hdr
>
> https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/183
>
> https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/14
>
> Would there need to be any XDG standards to support color managed HDR
> desktops, or is the window system support enough?
>
> I don't have much in mind, but then I've been staring only at the
> window system interactions, and haven't seen what else the desktop
> ecosystem or applications might need.
>
> Recommended display calibration and measurement procedures maybe?
>
> Desktop viewing environment standards?
>
> Viewing environment measurement?
>
> They could be as straightforward as referring to freely available
> ITU-R or SMPTE papers or others, if there are suitable ones.
>
>
> Thanks,
> pq
>


Re: New maintainer for XDG specs

2024-03-27 Thread salsaman
On Wed, 27 Mar 2024 at 10:47, 90  wrote:

> > Hi all,
> > After a bit of a period of inactivity, Matthias Klumpp has agreed to
> help update and maintain the XDG specs, including the menu and icon-theme
> specs in particular.
> >
> > If you have any suggestions or anything you want addressed, please
> discuss it on the list, or file an issue or merge request.
> >
> > Thanks to Matthias for stepping up, as well as to Bastien Nocera and
> David Faure in particular for all their work over the years to get us to
> this point.
> >
> > Cheers,
> > Daniel
>
> Hi, Matthias. Thank you for stepping in.
>
> Would you consider taking any suggestions for potentially
> improving/overhauling and even finalising the XDG Base Directory
> specification to provide something of a proper standard for applications to
> follow more consistently? As much as I love the spirit of that spec, I
> can't help but feel like there are still some flaws to address,
> particularly when it comes to things like XDG_RUNTIME_DIR's (frankly kind
> of insane) requirements and the somewhat nebulous nature of XDG_CONFIG_DIRS
> and XDG_DATA_DIRS.
>
> Kind regards,
> 90
>

ISTR that the xdg video standards were never defined. If you need any
advice or assistance with this, I would be happy to act in an
advisory capacity if that is called for. I have over 20 years experience of
developing Free Software video and have been an active participant in
developing other video / effects standards. I have been a bit out of the
spotlight recently as I have been busy architecting and implementing the
core components of the upcoming next-gen LiVES 4.0 video application plus
its accompanying state-of-the-art effects standard.
-- 
Gabriel Finch, a.k.a Salsaman, a.k.a DJ/VJ Salsa
sals...@gmail.com
*@whatsapp: +55 81 9971 97823 <https://wa.me/5581997197823>*

General manager, E-stud-1.0 Productions, Brazil.
(custom audio / video composition, mastering, and distribution)

Lead developer, the LiVES project, http://lives-video.com, UK / Europe.
(a tool for VJs and Digital Video Artists)

Contact for tailored audio / video solutions; live events / presentations,
research and consultancy regarding the interface between art and technology.
20+ year track record, with proven results. Master in Applied IT, certified
audio production engineer.

Service available in English and Portuguese.

https://soundcloud.com/salsa-man-1
https://www.youtube.com/c/salsaman
https://github.com/salsaman


Re: Unable to access free media player spec.

2020-08-26 Thread salsaman
Knowing where it is NOT isn't particularly useful. The obvious solution is
for somebody to upload a copy to gitlab, and then update the link in the
wiki.





http://lives-video.com
https://www.openhub.net/accounts/salsaman


On Mon, 17 Aug 2020 at 02:02, Thayne  wrote:

> gitorious was replaced by gitlab, so I would expect this to be in the
> xdg-specs repo https://gitlab.freedesktop.org/xdg/xdg-specs. However,
> I can't find it there.
>
> Thayne McCombs
>
> On Sun, Aug 16, 2020 at 8:59 PM salsaman  wrote:
> >
> > Hi,
> > I am unable to access the Free Media Player Specifications.
> >
> > clicking on the link shown on the page
> https://www.freedesktop.org/wiki/Specifications/free-media-player-specs/
> >
> >
> https://gitorious.org/xdg-specs/xdg-specs/trees/master/specifications/FMPSpecs?p=xdg-specs:xdg-specs.git;a=tree;f=specifications/FMPSpecs;hb=HEAD
> >
> > returns the following error in mozilla:
> >
> > "An error occurred during a connection to gitorious.org. SSL received a
> record that exceeded the maximum permissible length.
> >
> > Error code: SSL_ERROR_RX_RECORD_TOO_LONG"
> >
> > Please can you provide a working link for the spec.
> >
> > Thanks in advance,
> > GF.
> >
> >
> > http://lives-video.com
> > https://www.openhub.net/accounts/salsaman
> >
> >
> > ___
> > xdg mailing list
> > xdg@lists.freedesktop.org
> > https://lists.freedesktop.org/mailman/listinfo/xdg
>


Unable to access free media player spec.

2020-08-16 Thread salsaman
Hi,
I am unable to access the Free Media Player Specifications.

clicking on the link shown on the page
https://www.freedesktop.org/wiki/Specifications/free-media-player-specs/

https://gitorious.org/xdg-specs/xdg-specs/trees/master/specifications/FMPSpecs?p=xdg-specs:xdg-specs.git;a=tree;f=specifications/FMPSpecs;hb=HEAD

returns the following error in mozilla:

"An error occurred during a connection to gitorious.org. SSL received a
record that exceeded the maximum permissible length.

Error code: SSL_ERROR_RX_RECORD_TOO_LONG"

Please can you provide a working link for the spec.

Thanks in advance,
GF.


http://lives-video.com
https://www.openhub.net/accounts/salsaman