frames for me.
/Rickard
On Mon, Jul 24, 2023, at 06:48, Rickard Lindberg wrote:
Thanks for your support! This workaround works fine for me.
/Rickard
On Sun, Jul 23, 2023, at 00:22, Brian Matherly wrote:
> mlt-melt color:red in=0 out=25 -blank 25 color:blue in=0 out=25 -consumer
> av
guess is that I need to do something different in my code. Question is
what...
/Rickard
On Thu, Jul 20, 2023, at 22:16, Brian Matherly wrote:
Your MLT is over 2 years old. Can you try with a newer version?
Also, it would be helpful to reproduce with a melt command so that others can
easily recreate the problem.
Here are some initial comments:
Since mlt_tractor_pass_properties only operates on frames, maybe it should be a
member of mlt_frame?
mlt_tractor_pass_properties should probably have a more descriptive name. Which
properties specifically does it pass?
Could we convert the function to a single
/Rickard
On Thu, Jul 20, 2023, at 22:16, Brian Matherly wrote:
Your MLT is over 2 years old. Can you try with a newer version?
Also, it would be helpful to reproduce with a melt command so that others can
easily recreate the problem.
~Brian
On Thursday, July 20, 2023 at 03:00:45 PM CDT, Rickard
The cairoblend transition has an optimization that if the b frame is opaque,
it will never request the a frame:
https://github.com/mltframework/mlt/blob/master/src/modules/frei0r/transition_frei0r.c#L61
When this optimization is triggered, get_frame is never called on the a frame.
As a
On Thursday, July 20, 2023 at 03:00:45 PM CDT, Rickard Lindberg
wrote:
Hi,
I'm having trouble exporting a
If you want to disable SDL1, the packager can add "-DMOD_SDL1=OFF" to their
cmake call. This is how Shotcut does it:
https://github.com/mltframework/shotcut/blob/master/scripts/build-shotcut.sh#L766C161-L766C175
You should be asking the dependency packages if they are using the SDL1 module.
> Currently what I am thinking of is using SWIG for the code conversion that
> would change my Java code to C/C++ code that MLT will use.
I hope you mean the other way around - make Java bindings for MLT so that your
Java application can use it.
MLT already has Java bindings using SWIG:
> 1. identify all YUV only services
> 2. convert some of these to also handle RGB(A)
I am willing to participate in some of these. In particular, if someone
identifies a service that is causing extra conversions, I can work on updating
it to support RGB/RGBA.
> 3. find suitable replacements
Thanks for investigating this topic. Here are some comments from me:
1) I support an attempt to reduce unnecessary image format conversions. Your
focus is on track compositing. But it would also be good if we could find a
general pattern that would work for filters as well. Some filters can
I think you will need to give more information about what you are trying to
accomplish. I do not understand why you would make a filter to double the
dimensions of the image.
In MLT, the consumer expects to receive the image size that it has requested.
The producer has normalize filters (like
where the videos are properly
stacked, but the profile for SDL (720p 16:9) is not honored as I would expect.
For sure this is due to my poor understanding of profile's behavior. I'd like
to apply the target profile at the very end of the process.
On Fri, Jan 7, 2022 at 3:00, Brian Mat
I would suggest:
1) Create a producer to open V1 (the profile does not matter)
2) Query V1 for the size
3) Close V1 (delete the producer)
4) Create a producer to open V2 (the profile does not matter)
5) Query V2 for the size
6) Close V2 (delete the producer)
7) Calculate the profile that you
For this scenario (and any like it), I strongly recommend to not start with
code. Even for my own coding projects I do not start with code. Start with
Shotcut or Kdenlive and use the tool to visually achieve what you want. After
you get the result that you are looking for in the graphical
expect in accordance with
https://www.mltframework.org/plugins/FilterAffine/#transition.
Any clue about what might be going on?
On Thu, Dec 30, 2021 at 19:53, José María García Pérez
() wrote:
I was expecting a rotated image. The output was the video unmodified.
On Thu, Dec 30, 2021 at 1
> My first attempt wasn't very successful
What result did you expect, and what was the actual result?
On Thursday, December 30, 2021, 12:28:42 PM CST, José María García Pérez
wrote:
I wasn't aware about Shotcut doing so. Good to know.
My first attempt wasn't very successful. I did the
var pro:Producer = newFactoryProducer(p, null,
"./resources/pexels-pixabay-97082.jpg")
var pro:Producer = newFactoryProducer(p, null,
"./resources/pexels-pixabay-97082.svg")
The loader will figure out the best producer based on the file type.
The heuristic is
I think you have been reading the documentation, so I suppose you have already
seen these pages:
https://www.mltframework.org/docs/scriptbindings/
https://www.mltframework.org/docs/codeexamples/
I think the melt CLI application is probably your best option for C examples.
I do not have any
> ```
> Segment violated ('core' generated)
> ```
Nothing jumps out at me. I would suggest to run it in GDB and post the stack
trace.
~Brian
On Tuesday, November 9, 2021, 10:31:28 AM CST, José María García Pérez
wrote:
I have the following example:```c#include #include #include
int
> Am I correct that MLT translates from these clock times to frame numbers and
>uses frame numbers for everything internally?
Yes. "frame number" is often referred to as "position" in the code. I think the
two terms are used interchangeably.
> How does rounding work?
The code is
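For illustration only (this is a sketch of the idea, not MLT's actual implementation): converting a clock time to a frame position at a given frame rate by rounding to the nearest frame could look like:

```c
#include <assert.h>

/* Illustrative sketch, not MLT's code: map a clock time in seconds to a
 * frame number ("position") by rounding to the nearest frame at the
 * given frame rate. Assumes non-negative times. */
static int position_from_seconds(double seconds, double fps)
{
    return (int)(seconds * fps + 0.5);
}
```

For example, 0.5 s at 29.97 fps is 14.985 frames, which rounds to position 15.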
> How would you proceed?
I would go to the source and follow the trail:
https://sourceforge.net/p/sox/code/ci/master/tree/src/biquads.c#l412
Shotcut has a Bass/Treble filter that uses Ladspa. You could download Shotcut,
use the filter, and inspect the .mlt file to see how it works.
Patches to
> Hi,
>
> I'm trying to create a mosaic from several video files. I'll need to add
> and remove videos over time and change the position and size of videos.
>
> 1. Can I use MLT XML for this?
Yes
> 2. Is it possible to create the mosaic and output to a file as fast as
> possible instead of
> the final film needs to be in 10-bit color
HDR is not in your requirement? If you are allowed to use 709 colorspace, then
I do not understand the 10-bit requirement. MLT already supports RGB, which is
24 bits per pixel. YUV 4:2:2 @ 10 bits is only 20 bits per pixel. Maybe you could
use RGB
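The bits-per-pixel arithmetic above can be checked directly. This is generic video math, not an MLT API:

```c
#include <assert.h>

/* Generic video math, not an MLT API. Packed RGB stores three full
 * samples per pixel; 4:2:2 subsampling stores one luma sample per pixel
 * and shares each chroma sample (U, V) between two pixels. */
static int bpp_packed_rgb(int bits_per_sample)
{
    return 3 * bits_per_sample;
}

static int bpp_yuv422(int bits_per_sample)
{
    return bits_per_sample      /* Y, every pixel */
         + bits_per_sample / 2  /* U, shared by 2 pixels */
         + bits_per_sample / 2; /* V, shared by 2 pixels */
}
```

So 8-bit packed RGB is 24 bits per pixel, while 10-bit YUV 4:2:2 is only 20, which is the comparison being made above.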
As a reminder, frame threading is when
abs(consumer.get_int("real_time")) > 1. There are some problems with it.
1. sometimes there is a crash
2. image artifacts due to race conditions
I think that the main reason for this is that we try to allow each
service instance to be running
at 04:40:54AM +, Brian Matherly wrote:
> Yes. I expect that would get you very far. Then, you could probably do a
>round trip as long as there are no conversions needed between producer and
>consumer.
Awesome. And this would be enough, right, for Edit Decision List-styl
, Brian Matherly wrote:
> > > Am I (at all) on the right track here?:
> > >
> > > Yes!
> > >
> > > This code seems to say the avformat producer supports rgba, rgb24, and
> > > yuv422p?
> > > https://github.com/mltframework/mlt
> > Am I (at all) on the right track here?:
> >
> > Yes!
> >
> > This code seems to say the avformat producer supports rgba, rgb24, and
> > yuv422p?
> > https://github.com/mltframework/mlt/blob/master/src/modules/avformat/producer_avformat.c#L614-L630
> >
> > This is a better place to look:
On Monday, November 9, 2020, 03:24:43 AM CST, amin...@mailbox.org
wrote:
On Sat, Nov 07, 2020 at 03:13:49PM +, Brian Matherly wrote:
> > Thanks.
> >
> > Is there an easy way to list (or find) which formats are supported --
> > and thus don't require c
The filter will not process the audio until you call mlt_frame_get_audio():
https://github.com/mltframework/mlt/blob/master/src/framework/mlt_frame.h#L127
Why not listen to the "consumer-frame-render" event from the consumer and
request the level of the frame after all the filters have been
> Thanks.
>
> Is there an easy way to list (or find) which formats are supported --
> and thus don't require conversions -- for inputs and outputs of
> filters/transitions?
>
> Tom
I think the only way would be to audit the source code and see what mlt image
format the service requests.
>> That is correct, but beware of automatically-added normalization filters
>> that are added in src/modules/core/loader.ini for things such as scaling
>> and padding, etc.
> ffmpeg has options like -auto_conversion_filters [0] and "-pix_fmt +" [1] to
> explicitly disallow accidental
This happens because you have not specified a profile - so MLT makes up a
profile based on your first producer.
https://www.mltframework.org/docs/profiles/
An easy way to learn about profiles is to create a project with a tool like
Kdenlive or Shotcut and then inspect the output.
On
for 10-bit color support in the next 6-12 months that sort of
stops things in their tracks for me and I'll need to figure out something else.
Tom
> On Jul 28, 2020, at 23:20, Brian Matherly
> wrote:
>
> It would be helpful if you could provide more detail about what you are
It would be helpful if you could provide more detail about what you are trying
to accomplish. What do you mean by "modules" and which ones do you need to use?
~Brian
On Tuesday, July 28, 2020, 08:11:52 AM CDT, amin...@gmail.com
wrote:
Back when it seemed like a no-brainer I remember
> On Wednesday, September 11, 2019, 12:35:03 AM CDT, Kingsley G. Morse Jr.
> wrote:
> > > Can you suggest a reasonable amount of RAM for a
> 64 bit computer, so it can edit videos with melt's
> framework and kdenlive?
>
> At the moment, I tend to
> 1.) render to 1920x1080 .webm
> 2.) use
On 8/3/2018 9:12 PM, Carl Karsten wrote:
analysis pass stored these values:
What are the 3 parameters?
$ grep results Welcome_to_CircuitPython.mlt
L: -26.787672 R: 21.030387 P: 0.889130
L: -26.012146 R: 20.896616 P: 0.865967
L: -27.243035 R: 19.009469 P
On 6/5/2018 8:17 PM, alcinos wrote:
Hello,
I'm currently facing the following problem. I have a producer that is
going to be encapsulated into a timewarp producer to adjust speed, and
I would like to know beforehand what will be the duration of the
timewarped producer.
To give a concrete
Sure the 3rd option is much simpler, and since MLT doesn't currently
support > 8bit buffers it won't make much difference in the result.
A simple patch is in my github's fork:
https://github.com/j-b-m/mlt/commit/d8c723130e34da7680b50b47bc31a927ef6414cd
It supports both
Following Kdenlive's bug report:
https://bugs.kde.org/show_bug.cgi?id=392294
I noticed that encoding to webm with the avformat consumer produces
files reporting an incorrect fps (1000fps). Some googling suggested to
enforce the fps on the stream, and the following patch produces files
that
On 3/26/2018 1:49 AM, Jean-Baptiste Mardelle wrote:
On 23.03.2018 18:42, Dan Dennedy wrote:
Do not forget to check libavutil version when adding more pixfmts.
One more question since I am not very familiar with the avformat /
alpha pipeline.
The AV_PIX_FMT_YUVA444P10LE uses 10bits for each
On 3/23/2018 2:47 AM, Jean-Baptiste Mardelle wrote:
Hi all,
In order to optimize performance, in the qtblend transition (in the qt
module), I try to detect if the top frame in the transition has
transparency, and if it doesn't, instead of performing the transition I
just return the top
Hello. I have been trying to resize the dynamic text filter with no
luck. I have tried many iterations of the following command and they all
fail to resize the text.. the text just stays massive. I have also tried
the size argument and was unsuccessful as well. Any pointers? I suspect
I am
If I understand correctly, I did as you asked.
$ gdb
(gdb) file kdenlive
(gdb) b mlt_pool.c:142
(gdb) r
Thread 28 "RenderThread" hit Breakpoint 1, pool_fetch (self=0x809c8920)
at mlt_pool.c:142
(gdb) print self->size
$1 = 33554432
I expect you're in a
I elicit a
[mlt_pool] out of memory
by using the mouse to move kdenlive's time line
pointer.
There are two ways this error can happen:
https://github.com/mltframework/mlt/blob/master/src/framework/mlt_pool.c#L140
Either:
* the pool is initialized with a size of 0
or
* mlt_alloc() has
On 9/6/2017 9:13 AM, Rafal Lalik wrote:
I would also add one other suggestion for your consideration: If you
don't necessarily need all the other features of the Kdenlive Titler in
conjunction with your typewriter effect, you might consider making the
typewriter its own stand-alone service in
On 9/5/2017 12:23 PM, Rafal Lalik wrote:
Please tell me your thoughts about it and how you would like to proceed. I
can then eventually merge the code of derived classes in kdenlive into
basic classes and present as a pull request. The code for mlt actually
is already in its final state, could be
Hi David,
On 4/7/2017 3:18 PM, David Noble wrote:
> Hi
>
> I've got a couple 1080i25 sources[1], [2] that I'm trying to output to
> a 1080p50 avformat consumer, and I seem to be struggling!
>
> It seems that there are some interesting quirks in how mlt behaves
> contra how the yadif deinterlacer
Hello Dan, Brian.
Thank you for your answers. I've been struggling with this issue for a
while and I don't manage to get an almost constant video bitrate in
the output.
It seems I should use the --nal-hrd cbr option to "fill the mux" as
explained in post:
I need to get a true constant video bitrate in a UDP output address.
By using cbrts consumer I am getting a constant muxrate in the UDP
output, but not a constant video bitrate.
I have noticed that varying the gop size has an effect on the shape of
the output video bitrate curve, but still I
On 3/13/2017 9:17 AM, alcinos wrote:
> Thank you for your answer.
> So if I understand correctly the logic of the code you've linked, if I
> create my objects (filter, consumers, producers) with a pointer to the
> same Profile object, then modifying properties of this profile object
> (fps, …)
On 2/11/2017 9:20 AM, Maksym Veremeyenko wrote:
> any objection against introducing *mlt_image_yuv422p10* ?
>
I am in full support of 422 and planar formats. Just wondering about the
bit packing.
What would be the corresponding AV format?
Would it be 16 bpp with 6 bits unused? BE or LE?
If
On 2/1/2017 8:33 AM, Patrick Matthäi wrote:
> Am 01.02.2017 um 15:07 schrieb Brian Matherly:
>> I can provide an update on this.
>>
>> With the 1.2.0 release of libebur128, the upstream library should be
>> compatible with MLT. The only thing that would be needed
r policy. This is a clear fork of
>>> libebur128. Let them figure out what they want to do about it. Debian
>>> can simply disable that module if they feel the need.
>>>
>>>
>>> On Fri, Jul 8, 2016, 7:06 PM Brian Matherly <c...@brianmatherly.com
>
. The global thread
pool would be timely for this.
~Brian
From: Dan Dennedy <d...@dennedy.org>
To: Brian Matherly <c...@brianmatherly.com>; Maksym Veremeyenko
<ve...@m1stereo.tv>; mlt-devel <mlt-devel@lists.sourceforge.net>
Sent: Sunday, January 29, 2017 3:18 PM
> Hi,
>
> i am currently thinking about capturing and playback 10-bit yuv, but
> have no idea how start handling of it.
>
> ffmpeg has no format for packed 16-bit Y/Cr/Cb samples - only planar.
> v210 is also good, but not used as registered in MLT.
>
> have you any ideas?
>
> --
> Maksym
I've not used the slices module nor studied it deeply. So here are some random
comments for your consideration.
* Who would be responsible for *first* initializing the global pool?
* Whoever calls it first "wins" by being able to decide the number of threads.
Maybe the number of threads should
I'm not sure about setting specific values. But would mpegts_start_pid work for
you?
https://mltframework.org/plugins/ConsumerAvformat/#mpegtsstartpid
~Brian
From: David Alonso Grande
To: mlt-devel@lists.sourceforge.net
Sent: Thursday, January 12, 2017 8:49 AM
Set "meta.attr.service_provider.markup" and "meta.attr.service_name.markup"
attributes on the avformat consumer.
Example:
# melt input.mp4 -consumer avformat:output.ts
meta.attr.service_provider.markup=foo meta.attr.service_name.markup=bar
vcodec=libx264 f=mpegts muxrate=2000
Cheers,
~Brian
Depending on the producer, you might be able to query the image metadata and
determine the native format.
For example, the avformat_producer will often set the pix_fmt. Try to query
"meta.media.0.codec.pix_fmt" on a frame from the avformat producer widget.
Perhaps you could query the pix_fmt on
Hi JB,
I have an observation about this line:
https://github.com/mltframework/mlt/blob/master/src/modules/qt/transition_qtblend.cpp#L58
In Qt, the image origin is in the upper left:
http://doc.qt.io/qt-5/coordsys.html
In MLT, the image origin is in the bottom left.
So I wonder if the
JB,
I converted the common conversion functions to use QImage::Format_RGBA when
available:
https://github.com/mltframework/mlt/commit/c9e5c6acd31cab2ed5155fe884e949a090bb931b
This should also take care of the Qt version for you.
Please try it and make sure I didn't cause a regression.
Regards,
I think it would make sense to change the order: Qt5 first, then Qt4.
I do not think a "--force" option is necessary. User can manually specify Qt
using qt_libdir.
~Brian
From: Jean-Baptiste Mardelle
To: mlt-devel@lists.sourceforge.net
Sent: Tuesday, July 19, 2016
> Oh, right. I used it in my qtblend transition - then I guess I will enable
> it only on Qt5.
I would prefer to see copy_qimage_to_mlt_rgba() modified to use it with a
conditional to check the Qt version at compile time.
> I can confirm that you can draw text directly on the QImage
>
I think this would be a good optimization.
QImage::Format_RGBA is not available in Qt 4.8. So we would need some
pre-processor conditionals to use either the previous way or the new way
depending on version.
Are you sure that line order is the same between Qt and MLT image formats? For
some
JB,
I think this optimization could be made in frei0r.cairoblend.
Could you compare the performance of frei0r.cairoblend with transition_qtblend
when the top frame has transparency?
Seems like the transition_qtblend would be worth keeping regardless of whether
its performance is better.
~Brian
? Then, would the policy
no longer apply? After 6 months I am tired of waiting for libebur128 to accept
(or reject) the pull requests.
~Brian
From: Patrick Matthäi <pmatth...@debian.org>
To: Brian Matherly <c...@brianmatherly.com>; "mlt-devel@lists.sourceforge
tframework/mlt/commit/19b2fb8bc13ff561367d4bf6927ba7809f6b23dd
> Author: Brian Matherly <c...@brianmatherly.com>
> Date: 2016-03-02 (Wed, 02 Mar 2016)
>
> Changed paths:
> M src/modules/plus/Makefile
> M src/modules/plus/ebur128/ebur128.c
> M src/modules/plus/ebur128/ebur128.h
ing to H.264, I only need to set "threads=4" and I can pretty
much saturate my CPU. Basically, one core ends up doing all the decoding and
MLT processing and the rest of the cores get used up for encoding.
~Brian
From: jeffrey k eliasen <j...@jke.net>
To: Brian Matherly <
Have you checked that you have dev libraries for gcc 4.8 installed and not some
other version?
http://packages.ubuntu.com/xenial/libgcc-4.8-dev
I'm not sure which GCC is the default in Xenial, but it supports many:
http://packages.ubuntu.com/search?keywords=libgcc-=names=xenial=all
Looks like
> Building against external libebur is still on the todo list if I see it
> correctly?
Yes. Still waiting for them to process my pull requests.
~Brian
--
Find and fix application performance issues faster with
-Baptiste Mardelle <j...@kdenlive.org>
To: mlt-devel@lists.sourceforge.net
Sent: Saturday, April 16, 2016 12:03 PM
Subject: Re: [Mlt-devel] New motion tracker filter
On Saturday, April 16, 2016 12:03:24 AM CEST, Brian Matherly wrote:
Hi Brian,
Thanks for the feedback.
> Could you look
Cool filter.
Could you look at how we do 2-pass analyze/apply filters in vid.stab and
loudness and follow that convention?
https://github.com/mltframework/mlt/blob/master/src/modules/vid.stab/filter_vidstab.yml
https://github.com/mltframework/mlt/blob/master/src/modules/plus/filter_loudness.yml
If
You need to use the melt wrapper script in the Shotcut.app directory which sets
up the proper library paths:
$ xvfb-run -a /app/Shotcut/Shotcut.app/melt
Without the wrapper script, the executable doesn't know where to look for the
libraries.
~BM
From: Gonzalo García Berrotarán
In the next day or two I will make a commit to catch our internal ebur128 up to
the latest version. I am willing to administer a contribution patch that
provides an option to use internal or system ebur128. But the patch must use a
configure switch to enable system ebur128. The reason is that in
The "kdenlive_render" command from KDENLIVE is already a command line
application:
https://quickgit.kde.org/?p=kdenlive.git=tree=b78350e3e97a7fd39ec8b866a3d8e3c2b3ebe6bb=991187b276cecb98d428b9d96255c66db21052bd=renderer
They call it by passing in an MLT XML file (src) along with some other
I have a theory that your colons might be confusing the loader. The loader
supports a feature where the service can be specified as part of the argument
with a colon delimiter:
https://github.com/mltframework/mlt/blob/master/src/modules/core/producer_loader.c#L45
Perhaps your colon is getting
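A simplified sketch of that colon-delimited parsing (an illustration of the idea, not the loader's actual code; `split_service_arg` is a hypothetical helper):

```c
#include <assert.h>
#include <string.h>

/* Hypothetical helper illustrating "service:resource" splitting at the
 * first colon, like the loader feature described above. If no colon is
 * present, the whole argument is treated as the resource. */
static void split_service_arg(const char *arg, char *service,
                              size_t service_size, const char **resource)
{
    const char *colon = strchr(arg, ':');
    if (colon) {
        size_t n = (size_t)(colon - arg);
        if (n >= service_size)
            n = service_size - 1;
        memcpy(service, arg, n);
        service[n] = '\0';
        *resource = colon + 1;
    } else {
        service[0] = '\0';
        *resource = arg;
    }
}
```

With this kind of split, an argument that happens to contain a colon early in the string gets cut there, which is exactly the confusion being suspected above.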
they work.
It can take a little trial and error to get used to the syntax. So dig in and
have fun!
~Brian
On 11/30/15, Brian Matherly <c...@brianmatherly.com> wrote:
> I think you want to look into MLT transitions.
> ~Brian
Thanks, Brian. I checked
http://www.mltframework.org/
I think you want to look into MLT transitions.
~Brian
From: Zenny
To: mlt-devel@lists.sourceforge.net
Sent: Wednesday, November 25, 2015 2:04 AM
Subject: [Mlt-devel] Equivalent of ffmpeg's --filter-complex in melt?
Hi,
Is there an equivalent of ffmpeg's
"meta.attr.1.stream.creation_time.markup"
Comes from the metadata dictionary connected to the AVStream structure
associated with each stream (audio, video, etc). May or may not be present for
any given stream.
"meta.attr.creation_time.markup"
Comes from the metadata dictionary connected to the
Fixed in git:
https://github.com/mltframework/mlt/commit/97c2dd0de4f578ad40d547eddf78fcb1e4a008a4
From: Patrick Matthäi
To: MLT Mailinglist
Sent: Wednesday, November 4, 2015 11:40 AM
Subject: [Mlt-devel] Fwd: Bug#803841: mlt:
You can use the melt program. But why not use Audacity or some other audio
editor?
From: Axel Rousseau
To: mlt-devel@lists.sourceforge.net
Sent: Thursday, October 8, 2015 10:15 AM
Subject: [Mlt-devel] How could I assembly sound files ?
Hi,
I would
You make a good point. The same result can be achieved by applying the affine
transition. Instead of putting the rotation responsibility on the pango
producer, why not allow the watermark filter to use the affine transition?
From: Dan Dennedy d...@dennedy.org
To: Brian Matherly c
Personally, I would like to see this implemented in the qtext producer also
since we try to keep qtext and pango producers in sync.
Also, I would suggest angle or rotation instead of rot_angle.
~BM
From: Maksym Veremeyenko ve...@m1stereo.tv
To: mlt-devel@lists.sourceforge.net
Cc: Dan
http://www.mltframework.org/bin/view/MLT/Questions#Does_MLT_take_advantage_of_multi
Instead of real_time=0, try real_time=-1. On your 8-core machine, you might try
real_time=-8. The negative number indicates that frame dropping is disabled.
The real_time property changes the MLT consumer
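The real_time semantics described above can be summarized in a small sketch (my interpretation of the description, not MLT's code):

```c
#include <assert.h>

/* Interpretation of the description above, not MLT's code: the magnitude
 * of real_time selects the number of threads, and a negative value
 * disables frame dropping. */
static void decode_real_time(int real_time, int *threads, int *frame_dropping)
{
    *threads = real_time < 0 ? -real_time : real_time;
    *frame_dropping = real_time > 0;
}
```

So real_time=-8 means 8 threads with frame dropping disabled, matching the suggestion for an 8-core machine.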
I'm not sure the best way for Mac.
It does come bundled with Shotcut. So maybe that would work for you. Download
Shotcut, and then just use the bundled mlt binary:
http://www.shotcut.org/bin/view/Shotcut/Download
~Brian
From: Nisar Ahmed nisar@gmail.com
To: Brian Matherly c
You need to tell the dynamic text filter what text you want to display:
melt -verbose -profile dv_pal /Volumes/XSAN/test000.mov -attach
dynamictext:#timecode# -consumer decklink
~BM
From: Nisar Ahmed nisar@gmail.com
To: Dan Dennedy d...@dennedy.org
Cc: mlt-devel
I don't see any evidence in your output that suggests MLT is crashing. The
[mpegts...] messages are from ffmpeg and are benign. The rest of the
messages are from KDENLIVE.
~Brian
From: Zenny garbytr...@gmail.com
To: mlt-devel@lists.sourceforge.net
Sent: Saturday, April 11, 2015 7:19
Commit: 15f397deea19ded69bd068d6b7eb9be9d7b27806
https://github.com/mltframework/mlt/commit/15f397deea19ded69bd068d6b7eb9be9d7b27806
Author: Brian Matherly c...@brianmatherly.com
Date: 2015-02-25 (Wed, 25 Feb 2015)
Changed paths:
M src/modules/sdl/consumer_sdl_audio.c
Log
it - and let me know if you run into anything unexpected.
~Brian
From: Dan Dennedy d...@dennedy.org
To: Brian Matherly c...@brianmatherly.com
Cc: mlt-devel@lists.sourceforge.net mlt-devel@lists.sourceforge.net
Sent: Wednesday, February 25, 2015 3:41 PM
Subject: Re: [Mlt-devel
Looks like you are trying to run the script in a directory that already has
some really old sources. We have to switch git repos from time to time for
various reasons. Clean out the old source folders:
# rm -Rf *
# wget http://github.com/mltframework/mlt-scripts/raw/master/build/build-melt.sh
#
The motion_est module is GPL. You have to enable GPL modules when you configure.
Try:
# ./configure --enable-gpl
You can run:
# ./configure --help
to see all the build options.
After the build, run:
# melt -query filters
to see the list of filters that were built. If motion_est is not in the list, then it
I would do it like this:
producer1
producer2
producer3
producer4
playlist1: producer1, blank, producer3, blank
playlist2: blank, producer2, blank, producer4
tractor: playlist1, playlist2, transition1, transition2, transition3
The key will be setting the length of the blank entries and the in/out
Hi Jeremy,
Looking at the list of filters it looks like luma is the only one that will
take the current and next frame as input. Is there any way to accomplish the
same thing with other transitions, like composite or affine (maybe using
the transition filter somehow?).
Thanks!
Jeremy
Can you
Hello guys.
First of all merry Christmas :-)
Then: does someone know if Raspbian (Debian compiled for Raspberry Pi)
has support for the MLT framework?
Does it support hardware accelerated decode (the codecs are proprietary) for
mpeg2 and h264?
Is it possible to compile (make switches)
...
Branch: refs/heads/master
Home: https://github.com/mltframework/mlt
Commit: 629db075398167cbab6106cd73dd228b22018f25
https://github.com/mltframework/mlt/commit/629db075398167cbab6106cd73dd228b22018f25
Author: Brian Matherly pez4br...@yahoo.com
Date: 2014-12-05 (Fri, 05 Dec 2014
Thanks, Brian. A minor nit: in yml, instead of "default: unset" you
should simply omit that line unless there is actually some special
value "unset" for that property, as in rgblut. I am noticing a trend
towards that "unset" when the intention was that no default means
exactly that.
Great
Dan,
The level parameter is relatively new. How would you feel about
breaking backwards compatibility and converting it to dB? I'm
willing to do the work.
I think that will affect the current Flowblade release. Let's
see what Janne says. Otherwise, I do not have a problem with
the
Pascal,
Downloaded the latest version of shotcut and the included melt works just as
expected !
one more question: I'm animating the audio level with this kind of string:
-filter volume level=0=0;100=1;150=1;300=0
I'm wondering if this is the right way to do it and also what are the
units
Pascal,
You are absolutely correct, I was wondering why the sound seems
to go up much more rapidly than I expect.
Is it possible to indicate dB instead of just a factor?
It is not currently possible. But in my opinion, it would make sense to change
the behavior of the level parameter
Brian is working on a filter manager; so people not
interested in the NDVI filter do not need to mentally
skip over it every time they access the filter menu.
And I am currently working on the ability to let a
filter provide a QML UI that overlays the video area.
The filter manager is