y4mshift (use -h for documentation). You don't need to upsample to 444,
but it will give you finer control over the chroma shifts. Since the
luma alone can be moved by individual pixels (using -y and -Y) you still
have fine control over the differential shift with subsampled formats.
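A hedged sketch of the kind of pipe being described (shift amounts and filenames are illustrative; -y/-Y semantics are as described above, and the y4mscaler chroma-mode option is my assumption — check y4mshift -h and y4mscaler -h for the authoritative flags):

```shell
# Upsample to 4:4:4 for finer chroma control, then shift the luma
# plane independently of chroma with y4mshift's -y/-Y options.
lav2yuv input.avi \
  | y4mscaler -O chromass=444 \
  | y4mshift -y 2 -Y 1 \
  > shifted.y4m
```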
Dan
On Thu, 2009-06-04 at 16:04 +1000, Richard Archer wrote:
> At 3:41 PM +0200 3/6/09, Hervé wrote:
>
> >hello, I'm not a developer, but could it be a buffer concern? (it's
> >just an idea)
>
> Following this hint, I doubled the buffer sizes allocated
> by y4mstabilizer and it now works! I have no
I recently took some really unsteady video (walking on sand, partially
zoomed in; we're talking the bridge of the Enterprise after a direct
hit shaky) and decided to test the limits of y4mstabilizer. It seems
to be up to the task in theory (if one doesn't mind much of a given
frame being off-screen).
On Tue, 2006-04-11 at 01:26 +0200, Nicolas wrote:
> Do you know of any tool I could use in an mjpeg pipe to correct
> horizontal ("color Bleed") and/or vertical ("color Droop") chromashifts?
Use y4mshift with the -y/-Y options to shift luma and chroma
independently.
Dan
--
On Wed, 2005-03-09 at 09:35 +1300, E.Chalaron wrote:
> Now another thing, at the risk of appearing completely dumb: what is the
> purpose of y4mspatialfilter? Is it a convolution filter? If so, is there a
> way I can cut off frequencies to avoid the Nyquist effect?
y4mspatialfilter performs
On Sun, 2005-02-20 at 08:41 -0800, Steven Boswell II wrote:
> Some time ago, there was a discussion on 4:1:1 chroma
> subsampling in DV files of 3-2-pulldown sources, and
> how the color needed a special line-switch in order to
> be completely accurate. (Lines 2 and 3 of every group
> of 4 lines
operly
center them, was lots of standalone DVD player video glitches and
lockups. Cured every time by reburning sans label.
Dan Scholnik
On Wed, 2004-05-26 at 12:18, Steven M. Schultz wrote:
> On Wed, 26 May 2004, Dan Scholnik wrote:
>
> > Doesn't the ' in Y' indicate that the digital data has been
> > gamma-corrected to compensate for the nonlinear CRT response? In that
>
> I'd
ing noise usually dominates and dark frame subtraction would
only make things worse. At best you could subtract out the mean if
"black" has a constant bias. Besides, since dark noise doesn't change
quickly over time it wouldn't lead to noisy backgrounds, just
nonuniform
On Sat, 2004-05-22 at 01:56, Steven M. Schultz wrote:
> On Sat, 22 May 2004, Dan Scholnik wrote:
>
> > > > You might even try running y4mspatialfilter before and after y4mdenoise
> > > > in case the latter introduces any high-frequency artifacts.
>
> >
On Sat, 2004-05-22 at 00:50, Steven M. Schultz wrote:
> On Mon, 10 May 2004, Dan Scholnik wrote:
>
> > You might even try running y4mspatialfilter before and after y4mdenoise
> > in case the latter introduces any high-frequency artifacts.
>
> Ok - this I have done.
set of values is "-l 1 -t 4","-l 2 -t 6", and "-l 3 -t 8"
for light, medium, and heavy filtering.
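The presets quoted above, dropped into a typical mjpegtools pipe for illustration (decoder choice, encoder flags, and filenames are mine, not from the message):

```shell
# Medium spatial filtering ("-l 2 -t 6") between decode and encode.
lav2yuv input.avi \
  | y4mspatialfilter -l 2 -t 6 \
  | mpeg2enc -f 8 -o output.m2v
```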
Dan Scholnik
er has this feature), try
this:
yuvcorrect -Y Y_1.0_16_255_0_239
which just shifts the whole image down by 16 luminance values.
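A quick sanity check of the mapping that parameter string describes, under my reading of it as gain_inlow_inhigh_outlow_outhigh (gain 1.0, input 16..255 onto output 0..239 — i.e. a plain downward shift of 16, clamped at 0):

```shell
# Illustrative model of yuvcorrect -Y Y_1.0_16_255_0_239:
# every luma value drops by 16, with negative results clamped to 0.
shift_luma() {
  out=$(( $1 - 16 ))
  if [ "$out" -lt 0 ]; then out=0; fi
  echo "$out"
}
shift_luma 16    # black level 16 -> 0
shift_luma 235   # nominal white 235 -> 219
```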
Dan Scholnik
On Sat, 2004-04-24 at 22:38, Steven Boswell wrote:
> Andras Kadinger, fellow mjpeg-developer subscriber, was nice enough to
> agree to host this. The web page contains 2 movie clips that pretty
> dramatically show the results of using the denoiser vs. not using it.
I can see the difference when
om frame n followed by 1,3,5
from frame n+1. That's how they will be displayed on your TV. Any
other choice would introduce a mismatch in either time or space, albeit
small. I would be curious if you can tell the difference though.
Dan Scholnik
size=704x480 -O Xscale=15:16 -O Yscale=14:15
The result is a black border around the active frame. Note the aspect
ratio is changed slightly (but imperceptibly, by a factor of 225/224)
in order that the scaled size be convenient. I think in the current
version of y4mscaler "s
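The "factor of 225/224" claim checks out arithmetically: X is scaled by 15/16 and Y by 14/15, so the aspect ratio changes by (15/16)/(14/15) = (15*15)/(16*14) = 225/224, and the active area inside the 704x480 frame becomes 660x448:

```shell
# Verify the aspect-change factor and the resulting active image size.
num=$(( 15 * 15 ))       # 225
den=$(( 16 * 14 ))       # 224
w=$(( 704 * 15 / 16 ))   # 660 active pixels across
h=$(( 480 * 14 / 15 ))   # 448 active lines
echo "aspect change $num/$den, active area ${w}x${h}"
```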
> (Hmm... I wonder if there is something beneficial to the cubic-esque gradual
> taper, versus blurring/"noise-reduction" via lowering the cutoff frequency
> of an ideally sharp low-pass filter. Just musing to myself.)
In terms of visual appeal, that's probably an empirical question. But
> > about the speed can substitute their own convolve1D(). I've not
> > tested this for its effect on encoded bitrate, so I don't even know if
> > it is useful. Anyway, enjoy.
>
> How's the resulting encoded video look?
Depends on the source. Low quality stuff (like VHS capture) is
alre
about the speed can substitute their own convolve1D(). I've not
tested this for its effect on encoded bitrate, so I don't even know if
it is useful. Anyway, enjoy.
Dan Scholnik
/* y4mspatialfilter.c
* written by Dan Scholnik <[EMAIL PROTECTED]>
* spatial FIR filter for no
> > But it would be nice to have a spatial lowpass filter in the
> > toolbox also to reduce noise that way.
>
> What about yuvmedianfilter? Is it a spatial lowpass filter?
It's a nonlinear spatial filter that generally gives an overall
lowpass effect. Ideally it removes noise without attenuati
how many taps to use - should one
> use 4,6 or 8?) - at least that's been my observation.
Sure, because they preserve the high spatial frequencies better
(including the noise that lies there).
Dan Scholnik
> Some 4:3 TVs however, have a 16:9 enhanced mode. In this mode, they take the
> full vertical resolution signal and squish the scan lines into a 16:9
> letterbox area. This will give you more quality than throwing away the
> resolution before you encode.
Do such TVs have enough vertical pixels
> That is what I thought. When encoding a widescreen movie with aspect ratio
> 16:9, black borders are not needed to expand the video frame, as is needed
> when encoding to a 4:3 aspect ratio. The aspect ratio will be preserved.
> Therefore all the bits in the encoded video are used in the origina
> If people think that yuvdenoise and mpeg2enc are "more than ready" for a
> stable release, I'll package a 1.6.1.91... Else, I'll wait a few days
> longer. ;).
>
> Ronald
There was a problem reported a while back with post-1.6.1 yuvdenoise
(that is, after my 4:1:1 patches) producing some visual
> > yuvdenoise has some form of sharpening available with the -S
> > parameter, you might try that.
>
> I did, and I believe that the unsharp mask is more efficient. From what I
> understand it is more or less a way of improving the noise/signal ratio.
> But I might be wrong on this one...
yuvdenoise has some form of sharpening available with the -S
parameter, you might try that. In general, the problem with
sharpening video is that it increases the high frequencies (edges) and
in doing so raises the noise level. This will most likely raise the
bitrate, possibly substantially.
Dan
> I was wondering if I could ask for advice on high quality DV capture.
>
> I am using:
> - 1.6G Athlon system
> - standard RH9.0
> - latest dvgrab code
> - IEEE 1394 card (TSB12LV26 using TI chipset with ohci1394 driver)
> - SCSI (IBM DDYS-T18350N drive, AIC-7892A U160/m card)
>
> Even with my
> I've been using yuvkineco/mpeg2enc for pulldown. It stays perfectly in sync until
> it reaches around 100 minutes into the video. Then it gets WAY out of sync. It
> isn't slowly getting out of sync at 100, it just suddenly gets out of sync.
>
> Any ideas?
Use "yuvkineco -C frames.lst" in y
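A minimal sketch of where yuvkineco -C sits in a pipe, assuming the usual lav2yuv/mpeg2enc framing (only -C frames.lst comes from the advice above; the decoder, encoder flags, and filenames are illustrative):

```shell
# Inverse-telecine 29.97 fps material back to film rate, logging the
# per-frame pulldown decisions to frames.lst for later inspection.
lav2yuv capture.avi \
  | yuvkineco -C frames.lst \
  | mpeg2enc -f 8 -o film.m2v
```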
> *nod* This does not seem to be the same problem as you experienced with
> your camera, although some of the effect is similar.
There are evidently several problems that have similar looking
visual effects. I think Dan Scholnik's camcorder Y/C
The one big problem that I do have with the toolset is that I get
ghosting in very dark scenes, to the point of making some stuff *very*
uncomfortable to watch -- like seeing it through a heavy, dark fog or
something.
I have experienced the same or very similar eff
> From: [EMAIL PROTECTED]
> I use netpipes instead of rsh, but either way the network shouldn't be
I gave netpipes a try - worked "ok" but I couldn't find a way
to pass parameters thru to the remote process. Easy to set up
a shell
Question is, has anyone tried to "rsh" some jobs out to other
computers (e.g. on a 100Mb ethernet) to try to shorten the time,
or would that just mean the network becomes the bottleneck?
I use netpipes instead of rsh, but either way the network shouldn't be
much of a bo
>Is there an existing utility out there that can provide a delay
>between the luma and chroma channels? My camcorder produces video
>with the chroma lagging the luma, with the amount of lag dependent
>(for reasons unknown to me) on the light level. It's really
Can yuvdenoise cope with an interlaced source (VHS tape) and keep it
interlaced? It says:
"and then into the filter. (Well, the other way would work, too,
except that the filter doesn't do interlaced MC yet and doesn't do a
good job on interlaced
I toggled the progressive sequence bit to zero on every system header
and the resulting stream now plays correctly with mplayer! Haven't yet
checked on the Pioneer standalone.
Has anyone seen this? I believe dvd is not supposed to be marked as
progressiv
But if there is a Windows version, it would be nice (indeed, it must
be possible) to get hold of the code that generated the Windows
binaries, right?
Since I have a Win2k box at home (for the wife), I decided to try to
use it to offload some of my processing. I installed t
Out of curiosity how much is the data "damaged" by a conversion from
4:1:1 to RGB and from there to 4:2:0?
Or really how much more damage than is already done going from 4:1:1
to 4:2:0, which really results in the (quality) equivalent of 4:1:0
(if such a beast exists).
> What I'd really like is a tool that will scan the clip (or part of it)
> and automatically correct white balance and/or maximize contrast.
> Hand tweaking of yuvcorrect parameters is tedious.
Have you tried yuvcorrect_tune? It's a bit tricky to get going
Since the first patch really only changes the sequence header
prog_seq bit the desync bug will probably only show on some
decoders. If the decoder turns off pulldown based on this bit and
the frame rate (wrong!) it will lose sync. If it relies instead on
th
Does anyone on this list know how I can change the hue of a video
inside of the encoding pipeline? I made a recording with my
camcorder and I forgot to do the proper white-balancing prior to
the recording. As a consequence, I now have a DV file on my
compu
Ok, I found the bug. Seems that when MPEG_FORMAT_DVD split into
MPEG_FORMAT_DVD and MPEG_FORMAT_DVD_NAV, it didn't propagate
everywhere. Trivial patch (do I do any other kind?) enclosed.
Dan
--- mpeg2enc.cc.new 2003-03-18 16:16:14.0 -0500
+++ mpeg2enc.cc 2003-03-21 14:24:03.000
Is anyone successfully using the CVS version of mpeg2enc to encode
23.976 fps progressive material with the 3:2 pulldown flag for
playback on a 29.97 fps NTSC tv? If so, can you post your mpeg2enc
flags?
I'm having fits getting this to work post-1.6.1. I've tried both
"mpeg2enc -I 0 -p -F 4" an
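For reference, the flag set under test as I understand it (-p requests the 3:2 pulldown flags, -F 4 the 29.97 fps display rate, -I 0 frame-based encoding; the -f 8 DVD format and filenames are my illustrative additions, not from the message):

```shell
# Encode 23.976 fps progressive material with pulldown flags
# for 29.97 fps NTSC playback.
mpeg2enc -f 8 -I 0 -p -F 4 -o out.m2v < film.y4m
```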
Based on the past couple days of encoding the one knob I really
really want to tweak is the selecting blurring one - high motion
scenes could do with a bit of selective blurring but only the
encoder knows when that's needed...
Is th
To fulfill my need to tweak every knob, I made a trivial patch to
mpeg2enc.cc to allow the amount by which the quantization for
high-frequency components is increased to be specified on the command
line. I couldn't seem to get optional arguments to work with getopt,
so I resorted to adding --hf-b
You have to be working the graveyard shift... seems about the time
I'm set to churn in for the eve you pop up ;)
Something like that. I'm working pretty funny hours lately, mostly at
home (seems prudent, given that work is a gov't lab in DC). It's not
unrelated t
Ok, I give up. What is "bfr"? I don't seem to have that one.
Dan
Well, I tricked y4mscaler into doing what I want, but the results were
a bit discouraging. To get a horizontal lowpass spatial filter with
cutoff N/D<1 I just did a downsample by N/D followed by an upsample by
N/D (using the sinc8lan kernel). What I found was that (to my eye)
there was no visibl
For me, scalers are only an occasional diversion from my daily grind
For what it's worth, a 'blur factor' option is on my list of things to
add to y4mscaler; I imagine it would simply scale the size of the kernel
to lower the spatial cut-off frequency.
Right
--reduce-hf doesn't actually throw away the higher frequency DCT bins.
What it does is simply increase their quantisation (which of course means
more low-amplitude ones get 0-ed).
Is it easy to make the amount of the increase and the transition
adjustable? To give yet ano
One tiny and one crippling. Yes, it was the right version but
I goofed up on the second 'y/u/v' assignment, can't believe I did
something *that* dumb.
The effect was cool though - diagonal black lines across the screen.
The i
I found a couple of tiny (but crippling) y4mblackfix bugs. Did you check
in the right version? Seems to work now.
Dan
--- y4mblackfix.c.orig 2003-03-12 12:44:53.0 -0500
+++ y4mblackfix.c 2003-03-12 23:13:43.0 -0500
@@ -174,9 +174,9 @@
* process.
*/
mjp
It is a low volume group for the most part - would it be feasible
to use the individual article form I wonder?
Done. Bad enough I did it twice in a row, but I know me, and I'd do
it again if I stayed on digest. Now if others wouldn't rub it in by
replying with the same subject l
> From: [EMAIL PROTECTED]
> Subject: [Mjpeg-users] Re: [Possible SPAM] Mjpeg-users digest, Vol 1 #888 - 1\5 msgs
Interesting subject - wonder where that came from?
It came from hitting "r" on a digest email and f
I think extent of the ringing artifacts depend on the source images
being encoded. If the spatial bandwidth of the signal hitting
mpeg2enc has already been limited (by camera optics, preprocessing,
etc) then at higher spatial frequencies the spectrum will already roll
off, and will probably be le
I came up with a simple (i.e. very dumb, perhaps too much so) program
to generate a histogram of a Y4MPEG2 frame. Using the program on
Have you tried yuvcorrect -M STAT? It makes frame-by-frame histograms
in YCbCr and RGB. I used it a little earlier to investigate the range
of my camer
So if for example you had an image of a building against a blue sky, if
you looked closely next to the edge of the building there would be ghosts
of the building edge adjacent to the edge in the playback image of the
blue sky. Credit lettering and station ID bugs a
t till now, and was very surprised that it
lowers the bitrate so much yet I'm not sure I could tell in a double
blind test - certainly NOT the case for yuvdenoise and
yuvmedianfilter. Although my camera is so noisy in low light that I
need yuvdenoise just to make it look presentable.
I decided to compare using the --reduce-hf switch of mpeg2enc to other
methods for reducing noise and the bitrate, after having ignored it
previously. I was pleasantly surprised. To take an example, I
converted noisy DV footage to DVD format with -q 5 and -b 8500:
no denoising:
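A hedged reconstruction of the comparison setup (the -q 5, -b 8500 values and --reduce-hf are from the message; decoder choice and output names are mine):

```shell
# Same source, same quality/bitrate caps, with and without --reduce-hf,
# to compare the encoded bitrates.
lav2yuv noisy.avi | mpeg2enc -f 8 -q 5 -b 8500 -o plain.m2v
lav2yuv noisy.avi | mpeg2enc -f 8 -q 5 -b 8500 --reduce-hf -o reduced.m2v
```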
> softens the picture a lot. Median filtering in general is great for
> noise distributions with long tails (non-gaussian, impulsive noise)
> since it can exclude large outliers rather than averaging them in like
> linear filtering, but the flip-side seems to be that edge detail gets
> (bad) partial deinterlace. The output looks jerky and it doesn't save
> any bits either.
Hmm, I haven't noticed the jerkiness. Seemed to save some bits
but perhaps not as many as it could.
I never noticed before, but on a high-motion scene play
Well, that's what I get for reading the digest version of the list. I
figured nobody had a chance to get to it so soon. The new CVS version
works fine, my patch just moved the bypass outside the filter routine
altogether, and makes only 3 calls to memcpy. Possibly a little
faster, but I'm sure
Ok, it was so easy I did it myself. Here are patches for the CVS
yuvmedianfilter.c and yuvmedianfilter.1 to mention the -I interlace
flag and to handle -t 0 and/or -T 0 with -I.
Dan
--- yuvmedianfilter.c.orig 2003-02-24 23:09:49.0 -0500
+++ yuvmedianfilter.c 2003-03-04 00:10:14.
ssing in the filter. It should be a simple fix, if nobody else
does it I'll send a patch once I get around to it.
--
Dan Scholnik
[EMAIL PROTECTED]
The >235 Y' values aren't uncommon --- I've seen that referred to as
"Superwhite". These values don't map to anything in R'G'B', though,
and need to be clipped if working in R'G'B'. Otherwise, consider it
Actually, my current denoise color-correction uses RGB, so I gue
>I understand that the lowest "legal" Y' value is 16. I assume it comes
>right off my camera in the range 16-235? But the libdv and kino
Yep -- off the camera in 16-235 range.
Ok, I decided to test this out, and found some surprises. I have a
Canon ZR40, a nothing-spe
> NTSC Setup/Pedestal
> ===
> The decoder's add_ntsc_setup option should only be used
> by North American NTSC users when viewing the video on your computer
> monitor. It should never be used when transcoding, image processing,
The "+16" is not NTSC setup. It is simply footroom in the Y'CbCr encoding
scheme for digital pixel data. NTSC setup is part of the spec for *analog*
transmission.
I understand that the lowest "legal" Y' value is 16. I assume it comes
right off my camera in the range 16-235?
Could someone explain exactly when and where the NTSC setup (Y'+16)
should be added?
This much I think I know:
For NTSC DV, it is supposed to be added at the final analog output,
say at the camera when output to TV. To get the same effect when
playing on the computer, it needs to be added digit
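The digital-footroom half of this can be shown as a quick calculation: studio-range Y' (16..235) expands to full range (0..255) for computer display via y_full = round((y - 16) * 255 / 219), whereas NTSC setup is a +7.5 IRE pedestal applied only in the analog domain. The rounding formula below is my own illustration, not a quote from any tool:

```shell
# Expand studio-range Y' to full range; +109 implements rounding
# before the integer division by 219.
expand() { echo $(( ( ($1 - 16) * 255 + 109 ) / 219 )); }
expand 16    # studio black -> 0
expand 235   # studio white -> 255
```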
The binary works fine, but when compiling I get:
g++ -DYS_VERSION_MAJOR=0 -DYS_VERSION_MINOR=4 -DYS_VERSION_PATCH=0 -O2 -march=i686
-mcpu=i686 -I/usr/local/include/mjpegtools -Wall -W -c -o graphics.o graphics.C
graphics.C: In member function `int ysRegion::parse_geometry(const char*)':
graph
I switched to the CVS version of mpeg2enc, and while transcoding a
NTSC mpeg2 w/ pulldown (via transcode) I got some funny results. The
.m2v seemed ok (mplayer reports the correct frame rate), but the
resulting multiplexed .mpg has major sync problems - the video
gradually falls behind, and mplay
>> Hmmm, I wonder if y4mscaler can be used simply to do the 411 -> 420
>> conversion even in the situation where no scaling is being requested.
>You bet --- it is all just scaling after all:
> The chroma planes are being scaled by (2/1, 1/2) and the luma plane is
> get
The problems I'm seeing don't seem like just problems of light and
dark gray winding up white and black. I'm having white map to pale
green. I suppose this could result from quantization/clipping
somehow, since R, G, and B do not have equal weight. I'll have to do
some histogramming to see if t
e green tint if I use a commercial DVD as my movie source instead.
So, what's the deal? Is this an actual error in
yuvdenoise/medianfilter, or an expected artifact from denoising the
color info? Am I the only one who sees this? I see it both in 1.6.x
and in CVS.
--
Dan Scholnik
[EMAIL PROTECTED]