Re: [FFmpeg-devel] [PATCH] avfilter: add normalize filter

2017-11-26 Thread Richard Ling
Thanks Paul.

Thanks also to all reviewers for your comments! It's very helpful to have
extra sets of eyes to find my bugs.

Moritz is right, there is an unused #define; I will try to find time to
send a patch, or maybe Paul can remove it.

Regards
R.
___
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-devel


Re: [FFmpeg-devel] [PATCH] avfilter: add normalize filter

2017-11-23 Thread Richard Ling
On Nov 21, 2017 10:32 PM, "Moritz Barsnick"  wrote:
>
> Nice. I personally appreciate your code comments, as I'm no big filter
> author (yet).

I've never made any contribution to ffmpeg before, so I'm almost certainly
a bad example to follow :-P

But I do like code to be well commented.

Regards
R.


Re: [FFmpeg-devel] [PATCH] avfilter: add normalize filter

2017-11-21 Thread Richard Ling
Updated patch.
The integer overflow is avoided by limiting the smoothing parameter to
INT_MAX/8; it is later multiplied by 6.
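A minimal sketch of that kind of clamp (names are illustrative, not the patch's actual identifiers):

```c
#include <limits.h>

/* Illustrative sketch only: clamp a user-supplied smoothing value so that
 * a later multiplication by 6 cannot overflow a signed int. The names do
 * not come from the actual patch. */
static int clamp_smoothing(long long smoothing)
{
    const long long max_smoothing = INT_MAX / 8;
    if (smoothing < 0)
        smoothing = 0;
    if (smoothing > max_smoothing)
        smoothing = max_smoothing;
    return (int)smoothing;
}

/* 6 * (INT_MAX / 8) still fits in an int, so this cannot overflow. */
static int scaled_history(int smoothing)
{
    return 6 * smoothing;
}
```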
Regards
R.


0001-avfilter-add-normalize-filter.patch
Description: Binary data


Re: [FFmpeg-devel] [PATCH] avfilter: add normalize filter

2017-11-20 Thread Richard Ling
Thanks Moritz.
I'll update and repost a new patch.
I also noticed another error of mine: I hadn't fully updated the examples
in the documentation after changes to the parameter.
Regards
R.


Re: [FFmpeg-devel] [PATCH] avfilter: add normalize filter

2017-11-20 Thread Richard Ling
Patch attached.

R.


0001-avfilter-add-normalize-filter.patch
Description: Binary data


Re: [FFmpeg-devel] [PATCH] avfilter: add normalize filter

2017-11-19 Thread Richard Ling
OK, trying again. I've worked out how to send to myself without corruption
(I think).

From 590b3bc8e2675c75c2ff7e75f7fc1fbb1e1a8f71 Mon Sep 17 00:00:00 2001
From: Richard Ling 
Date: Thu, 16 Nov 2017 23:00:01 +1100
Subject: [PATCH] avfilter: add normalize filter

---
 doc/filters.texi   |  80 ++
 libavfilter/Makefile   |   1 +
 libavfilter/allfilters.c   |   1 +
 libavfilter/vf_normalize.c | 387 +
 4 files changed, 469 insertions(+)
 create mode 100644 libavfilter/vf_normalize.c

diff --git a/doc/filters.texi b/doc/filters.texi
index 4a35c44..4cfa031 100644
--- a/doc/filters.texi
+++ b/doc/filters.texi
@@ -10815,6 +10815,86 @@ Add temporal and uniform noise to input video:
 noise=alls=20:allf=t+u
 @end example

+@section normalize
+
+Normalize RGB video (aka histogram stretching, contrast stretching).
+See: https://en.wikipedia.org/wiki/Normalization_(image_processing)
+
+For each channel of each frame, the filter computes the input range and maps
+it linearly to the user-specified output range. The output range defaults
+to the full dynamic range from pure black to pure white.
+
+Temporal smoothing can be used on the input range to reduce flickering (rapid
+changes in brightness) caused when small dark or bright objects enter or leave
+the scene. This is similar to the auto-exposure (automatic gain control) on a
+video camera, and, like a video camera, it may cause a period of over- or
+under-exposure of the video.
+
+The R,G,B channels can be normalized independently, which may cause some
+color shifting, or linked together as a single channel, which prevents
+color shifting. Linked normalization preserves hue. Independent normalization
+does not, so it can be used to remove some color casts. Independent and linked
+normalization can be combined in any ratio.
+
+The normalize filter accepts the following options:
+
+@table @option
+@item blackpt
+@item whitept
+Colors which define the output range. The minimum input value is mapped to
+the @var{blackpt}. The maximum input value is mapped to the @var{whitept}.
+The defaults are black and white respectively. Specifying white for
+@var{blackpt} and black for @var{whitept} will give color-inverted,
+normalized video. Shades of grey can be used to reduce the dynamic range
+(contrast). Specifying saturated colors here can create some interesting
+effects.
+
+@item smoothing
+The number of previous frames to use for temporal smoothing. The input range
+of each channel is smoothed using a rolling average over the current frame
+and the @var{smoothing} previous frames. The default is 0 (no temporal
+smoothing).
+
+@item independence
+Controls the ratio of independent (color shifting) channel normalization to
+linked (color preserving) normalization. 0.0 is fully linked, 1.0 is fully
+independent. Defaults to 1.0 (fully independent).
+
+@item strength
+Overall strength of the filter. 1.0 is full strength. 0.0 is a rather
+expensive no-op. Defaults to 1.0 (full strength).
+
+@end table
+
+@subsection Examples
+
+Stretch video contrast to use the full dynamic range, with no temporal
+smoothing; may flicker depending on the source content:
+@example
+normalize=blackpt=black:whitept=white:smoothing=0
+@end example
+
+As above, but with 2 seconds of temporal smoothing; flicker should be
+reduced, depending on the source content:
+@example
+normalize=blackpt=black:whitept=white:smoothing=2
+@end example
+
+As above, but with hue-preserving linked channel normalization:
+@example
+normalize=blackpt=black:whitept=white:smoothing=2:independence=1
+@end example
+
+As above, but with half strength:
+@example
+normalize=blackpt=black:whitept=white:smoothing=2:independence=1:strength=0.5
+@end example
+
+Map the darkest input color to red, the brightest input color to cyan:
+@example
+normalize=blackpt=red:whitept=cyan
+@end example
+
 @section null

 Pass the video source unchanged to the output.
diff --git a/libavfilter/Makefile b/libavfilter/Makefile
index b7ddcd2..7843ad8 100644
--- a/libavfilter/Makefile
+++ b/libavfilter/Makefile
@@ -245,6 +245,7 @@ OBJS-$(CONFIG_NLMEANS_FILTER)+= vf_nlmeans.o
 OBJS-$(CONFIG_NNEDI_FILTER)  += vf_nnedi.o
 OBJS-$(CONFIG_NOFORMAT_FILTER)   += vf_format.o
 OBJS-$(CONFIG_NOISE_FILTER)  += vf_noise.o
+OBJS-$(CONFIG_NORMALIZE_FILTER)  += vf_normalize.o
 OBJS-$(CONFIG_NULL_FILTER)   += vf_null.o
 OBJS-$(CONFIG_OCR_FILTER)+= vf_ocr.o
 OBJS-$(CONFIG_OCV_FILTER)+= vf_libopencv.o
diff --git a/libavfilter/allfilters.c b/libavfilter/allfilters.c
index 3647a11..36e3edb 100644
--- a/libavfilter/allfilters.c
+++ b/libavfilter/allfilters.c
@@ -255,6 +255,7 @@ static void register_all(void)
 REGISTER_FILTER(NNEDI,  nnedi,  vf);
 REGISTER_FILTER(NOFORMAT,   noformat,   vf);
 REGISTER_FILTER(NOISE,  noise,   
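The per-channel linear mapping that the patch documentation above describes can be sketched stand-alone like this (a simplification, not the patch's actual code; smoothing, channel linking and strength are omitted):

```c
#include <stdint.h>

/* Simplified sketch of the core normalization step: map the observed input
 * range [in_min, in_max] linearly onto [blackpt, whitept]. Illustrative
 * only; the real filter also applies smoothing, linking and strength. */
static uint8_t normalize_value(uint8_t in, uint8_t in_min, uint8_t in_max,
                               uint8_t blackpt, uint8_t whitept)
{
    if (in_max == in_min)
        return blackpt;  /* flat input range: avoid division by zero */
    return (uint8_t)(blackpt + (in - in_min) * (whitept - blackpt)
                     / (double)(in_max - in_min) + 0.5);
}
```

Note that swapping blackpt and whitept in this formula gives the color-inverted output mentioned in the documentation.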

Re: [FFmpeg-devel] [PATCH] avfilter: add normalize filter

2017-11-16 Thread Richard Ling
On 24 October 2017 at 07:26, Paul B Mahol  wrote:
> On 9/14/17, Richard Ling  wrote:
>> Hi,
>>
>> This patch adds a filter to normalize (contrast stretch) RGB video.
>> Comments welcome.
>>
>> R.
>
> What's status of this?

I created a new patch based on the feedback from Nicolas, but I was
not able to get Gmail to send it back to me without mangling it.
According to the answer at the bottom of
https://stackoverflow.com/questions/6535761/how-to-email-patches-formatted-with-git-format-patch
it is not possible to send git patches with Gmail due to its broken
mailer.
So I just set it aside.

Maybe there's some other way to send a patch (base64, attached zip file, ???)

Best regards
R


[FFmpeg-devel] How to tell CFR/VFR video apart, and determine frame rate

2017-09-29 Thread Richard Ling
How can I determine whether a filter input is VFR or CFR?

static int is_vfr(AVFilterLink *inlink)
{
return ???;
}

If I determine the input is CFR, how do I get the frame rate in FPS?
Is it always the reciprocal of the time base, and if not, what is the
correct implementation?

static float get_constant_frame_rate(AVFilterLink *inlink)
{
return 1.0f / av_q2d(inlink->time_base);  // correct??
}
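For what it's worth, here is a stand-alone illustration of why the plain reciprocal of the time base can be misleading. The Rational type and q2d below are simplified stand-ins mimicking FFmpeg's AVRational/av_q2d, not the real API:

```c
/* Stand-ins for FFmpeg's AVRational / av_q2d(), for illustration only. */
typedef struct { int num, den; } Rational;

static double q2d(Rational q)
{
    return q.den ? (double)q.num / q.den : 0.0;
}

/* The questionable approach: a typical MPEG-TS stream plays at 25 fps but
 * uses a 1/90000 time base, so the reciprocal claims 90000 fps. */
static double fps_from_timebase(Rational time_base)
{
    return (double)time_base.den / time_base.num;
}

/* Using a separate frame_rate rational instead; a zero numerator can
 * signal an unknown or variable frame rate. */
static double fps_from_framerate(Rational frame_rate)
{
    return q2d(frame_rate);
}
```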

Thanks in advance...
R


Re: [FFmpeg-devel] [PATCH] avfilter: add normalize filter

2017-09-21 Thread Richard Ling
>
>
> Thanks for the patch. Unfortunately, your mail software mangled it with
> line breaks, it cannot be applied as is. Still, see a few comments
> below.
>
>
I should have posted to myself first to make sure it worked OK.
I will do that before posting another patch to the list.

> +The amount of temporal smoothing, expressed in seconds. the input range of
> +each channel is smoothed using a rolling average over that many seconds of
> +video. Defaults to 0.0 (no temporal smoothing).  The maximum is 60 seconds.

> If I read the code correctly, the rolling average is asymmetrical:
> the current frame is computed using the n previous frames rather than
> the n/2 previous and n/2 next. Which, of course, is the best that can be
> done without buffering that many frames.
>
> But the documentation should probably specify it.
>
Yes, it is asymmetrical.
OK, I'll change it to "...a rolling average over that many seconds of PREVIOUS
video".
Although, it only makes sense if there is a way to allocate a buffer for
the rolling average based on the frame rate, and you imply below that there
is not.


>
> > +normalize=black:white:0
>
> Better use named options in example. See the drama about that last
> summer.
>

OK, I'll do that.


>
> > +#ifndef MIN
> > +#define MIN(x,y)((x) < (y) ? (x) : (y))
> > +#endif
> > +#ifndef MAX
> > +#define MAX(x,y)((x) > (y) ? (x) : (y))
> > +#endif
>
> FFMIN(), FFMAX()
>

OK, will use those, I didn't know they existed.

> > +#define INIT(c) (min[c].in = max[c].in = in->data[0][s->co[c]])
> > +#define EXTEND(c)   (min[c].in = MIN(min[c].in, inp[s->co[c]])), \
> > +(max[c].in = MAX(max[c].in, inp[s->co[c]]))
> > +
> > +INIT(0);
> > +INIT(1);
> > +INIT(2);
>
> I think a loop over c, same as below, would be better style.

Agreed, will change it

> +// For future use...
> +static av_cold int init(AVFilterContext *ctx)
> +{
> +return 0;
> +}
>
> It can be added when needed.

Agreed, I will remove that function.

> +// Convert smoothing value (seconds) to history_len (a count of frames
> +// to average, must be at least 1).
> +s->history_len = (int)(s->smoothing / av_q2d(inlink->time_base)) + 1;
> +// In case the frame rate is unusually high, cap it to MAX_HISTORY_LEN
> +// to avoid allocating stupid amounts of memory.
>
> According to the comments, you are mistaking time_base with the stream's
> frame rate. They are not the same, and the streams are not guaranteed to
> be constant frame rate anyway.
>
> I think you should consider using an exponential moving average instead:
> you can get rid of all the history code. Furthermore, exponential moving
> average supports variable frame rate with just a bit of maths: set the
> decay coefficient to exp(-dt/T); extra bonus: compute exp(-x) as
> 1/exp(x) using a short power series to approximate the exponential.
>

I'm concerned that with an exponential moving average, it is not easy to
choose a suitable decay rate.
Even with a very fast decay rate, the output range depends on every frame
ever seen, no matter how long ago.  There is no upper limit on age.  The
problem gets worse with a slower decay rate.  But with a faster decay rate,
there is not enough smoothing (too much bias toward the most recent frames),
so flickering will return.
The rectangular window of a rolling average makes more intuitive sense to me,
but I haven't tested it (I've tested the existing method extensively).

Is it possible to test for variable frame rate video and make the filter
issue a warning and pass through (or simply fail, i.e. force an error)?
I have no need to work with such video.  Other existing temporal
filters must have the same problem.  It could be a future enhancement for
the filter, when (if) it is ever needed.


> Regards,
>
> --
>   Nicolas George
>
>
>
Cheers

R.


[FFmpeg-devel] [PATCH] avfilter: add normalize filter

2017-09-13 Thread Richard Ling
Hi,

This patch adds a filter to normalize (contrast stretch) RGB video.
Comments welcome.

R.

From f08f132ecd79718d0ce6fb07f99c84ab5dd52ee4 Mon Sep 17 00:00:00 2001
From: Richard Ling 
Date: Thu, 14 Sep 2017 13:18:50 +1000
Subject: [PATCH] avfilter: add normalize filter

---
 doc/filters.texi   |  79 +
 libavfilter/Makefile   |   1 +
 libavfilter/allfilters.c   |   1 +
 libavfilter/vf_normalize.c | 415 +
 4 files changed, 496 insertions(+)
 create mode 100644 libavfilter/vf_normalize.c

diff --git a/doc/filters.texi b/doc/filters.texi
index 830de54..1e7712a 100644
--- a/doc/filters.texi
+++ b/doc/filters.texi
@@ -10808,6 +10808,85 @@ Add temporal and uniform noise to input video:
 noise=alls=20:allf=t+u
 @end example

+@section normalize
+
+Normalize RGB video (aka histogram stretching, contrast stretching).
+See: https://en.wikipedia.org/wiki/Normalization_(image_processing)
+
+For each channel of each frame, the filter computes the input range and maps
+it linearly to the user-specified output range. The output range defaults
+to the full dynamic range from pure black to pure white.
+
+Temporal smoothing can be used on the input range to reduce flickering (rapid
+changes in brightness) caused when small dark or bright objects enter or leave
+the scene. This is similar to the auto-exposure (automatic gain control) on a
+video camera, and, like a video camera, it may cause a period of over- or
+under-exposure of the video.
+
+The R,G,B channels can be normalized independently, which may cause some
+color shifting, or linked together as a single channel, which prevents
+color shifting. Linked normalization preserves hue. Independent normalization
+does not, so it can be used to remove some color casts. Independent and linked
+normalization can be combined in any ratio.
+
+The normalize filter accepts the following options:
+
+@table @option
+@item blackpt
+@item whitept
+Colors which define the output range. The minimum input value is mapped to
+the @var{blackpt}. The maximum input value is mapped to the @var{whitept}.
+The defaults are black and white respectively. Specifying white for
+@var{blackpt} and black for @var{whitept} will give color-inverted,
+normalized video. Shades of grey can be used to reduce the dynamic range
+(contrast). Specifying saturated colors here can create some interesting
+effects.
+
+@item smoothing
+The amount of temporal smoothing, expressed in seconds. The input range of
+each channel is smoothed using a rolling average over that many seconds of
+video. Defaults to 0.0 (no temporal smoothing).  The maximum is 60 seconds.
+
+@item independence
+Controls the ratio of independent (color shifting) channel normalization to
+linked (color preserving) normalization. 0.0 is fully linked, 1.0 is fully
+independent. Defaults to fully independent.
+
+@item strength
+Overall strength of the filter. 1.0 is full strength. 0.0 is a rather
+expensive no-op.
+
+@end table
+
+@subsection Examples
+
+Stretch video contrast to use the full dynamic range, with no temporal
+smoothing; may flicker depending on the source content:
+@example
+normalize=black:white:0
+@end example
+
+As above, but with 2 seconds of temporal smoothing; flicker should be
+reduced, depending on the source content:
+@example
+normalize=black:white:2
+@end example
+
+As above, but with hue-preserving linked channel normalization:
+@example
+normalize=black:white:2:1
+@end example
+
+As above, but with half strength:
+@example
+normalize=black:white:2:1:0.5
+@end example
+
+Map the darkest input color to red, the brightest input color to cyan:
+@example
+normalize=red:cyan
+@end example
+
 @section null

 Pass the video source unchanged to the output.
diff --git a/libavfilter/Makefile b/libavfilter/Makefile
index 8aa974e..31f8170 100644
--- a/libavfilter/Makefile
+++ b/libavfilter/Makefile
@@ -243,6 +243,7 @@ OBJS-$(CONFIG_NLMEANS_FILTER)+= vf_nlmeans.o
 OBJS-$(CONFIG_NNEDI_FILTER)  += vf_nnedi.o
 OBJS-$(CONFIG_NOFORMAT_FILTER)   += vf_format.o
 OBJS-$(CONFIG_NOISE_FILTER)  += vf_noise.o
+OBJS-$(CONFIG_NORMALIZE_FILTER)  += vf_normalize.o
 OBJS-$(CONFIG_NULL_FILTER)   += vf_null.o
 OBJS-$(CONFIG_OCR_FILTER)+= vf_ocr.o
 OBJS-$(CONFIG_OCV_FILTER)+= vf_libopencv.o
diff --git a/libavfilter/allfilters.c b/libavfilter/allfilters.c
index 63e8672..af2287b 100644
--- a/libavfilter/allfilters.c
+++ b/libavfilter/allfilters.c
@@ -255,6 +255,7 @@ static void register_all(void)
 REGISTER_FILTER(NNEDI,  nnedi,  vf);
 REGISTER_FILTER(NOFORMAT,   noformat,   vf);
 REGISTER_FILTER(NOISE,  noise,  vf);
+REGISTER_FILTER(NORMALIZE,  normalize,  vf);
 REGISTER_FILTER(NULL,   null,   vf);
 REGISTER_FILTER(OCR,ocr,vf);
 REGISTER_FILTER(OCV

Re: [FFmpeg-devel] Confusion over temporal filters.

2017-09-10 Thread Richard Ling
Thanks Marton & Paul.
Flushing filter state after a seek makes good sense.  For my proposed
filter (video normalization, with auto gain control) flushing will give a
much more acceptable result than using unrelated frames.  For my use case
I'm only interested in transcoding complete files, so I always get the
exact expected result.  But maybe some time, the filter API could be
updated so that after a seek, a filter can request frames that are before
the seek position.
Thanks again for your answers.
R.

On 10 September 2017 at 17:17, Marton Balint  wrote:

>
>
> On Sun, 10 Sep 2017, Paul B Mahol wrote:
>
> On 9/10/17, Richard Ling  wrote:
>>
>>> I'm thinking of adding a temporal filter (one that relies on context from
>>> previous frames), and I've realised I'm a bit confused about how they
>>> should work.
>>>
>>> Say I open a file with ffplay and let it play up to frame 100.  Then I
>>> open
>>> the same file with another instance of ffplay and seek directly to frame
>>> 100.  It seems to me that frame 100 should look the same in both cases.
>>> Put another way, a specific set of filters, applied to a specific file,
>>> should completely define what every frame of that file looks like.
>>> Seeking
>>> around in the file should not change that.  Is that right (in principle)?
>>>
>>> So, looking at some of the existing temporal filters (eg. deflicker), I
>>> don't think that is what happens.  They filter based on the frames
>>> previously passed to the filter, and if the user seeks over a bunch of
>>> frames, the filter will see frames as consecutive that are not actually
>>> consecutive in the file, so it will give a different result.  Also,
>>> looking
>>> at the API, I can't see a way to get the behaviour I expect.  I can't
>>> see a
>>> way for a filter to ask its inputs for a frame from a different
>>> (specific)
>>> time.  Is that right?
>>>
>>> If my understanding is wrong, please let me know!
>>>
>>> If my undersanding is correct, then I guess my questions are:
>>> (1) is this behaviour a known issue (or a deliberate design choice)?
>>>
>>
>> Both. There is no seeking in lavfi (yet).
>>
>
> Actually ffplay re-creates the filter chain after every seek flushing any
> state of the filters, therefore non-consecuitve frames are not passed to
> the filters.
>
> Regards,
> Marton
>


[FFmpeg-devel] Confusion over temporal filters.

2017-09-09 Thread Richard Ling
I'm thinking of adding a temporal filter (one that relies on context from
previous frames), and I've realised I'm a bit confused about how they
should work.

Say I open a file with ffplay and let it play up to frame 100.  Then I open
the same file with another instance of ffplay and seek directly to frame
100.  It seems to me that frame 100 should look the same in both cases.
Put another way, a specific set of filters, applied to a specific file,
should completely define what every frame of that file looks like.  Seeking
around in the file should not change that.  Is that right (in principle)?

So, looking at some of the existing temporal filters (eg. deflicker), I
don't think that is what happens.  They filter based on the frames
previously passed to the filter, and if the user seeks over a bunch of
frames, the filter will see frames as consecutive that are not actually
consecutive in the file, so it will give a different result.  Also, looking
at the API, I can't see a way to get the behaviour I expect.  I can't see a
way for a filter to ask its inputs for a frame from a different (specific)
time.  Is that right?

If my understanding is wrong, please let me know!

If my understanding is correct, then I guess my questions are:
(1) is this behaviour a known issue (or a deliberate design choice)?
(2) is it OK for new temporal filters to keep the same behaviour as
existing ones -- that is, they will give different results when seeking
happens, compared to sequential processing?  If so I'll just do the same
thing for my filter.

Thanks in advance for your thoughts.
R