Re: [FFmpeg-user] Android NDK r18b support

2018-12-06 Thread Davood Falahati
> > I'm using the following steps to cross-compile ffmpeg
Follow this:
https://github.com/falahati1987/ffmpeg4Android

Hope it helps.


- Davood Falahati

___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".

Re: [FFmpeg-user] probleme lors de la compilation de ffmpeg

2018-03-07 Thread Davood Falahati
Dear Christophe,

> when compiling FFmpeg, we run into blocking errors.

The official language of this list is English. That said, why don't you use
the pre-built FFmpeg executables?

-Davood Falahati

Re: [FFmpeg-user] Autorotation issues - portrait orientation screen

2018-01-07 Thread Davood Falahati
>> So what I want is:
>> turn off the autorotation for files with a rotation flag

ffmpeg has automatically applied display_matrix side data to the video
stream for a long time now. Have you tried the -noautorotate option in
your script?
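A minimal sketch of what that might look like (file names are placeholders):

```shell
# -noautorotate is a per-input option, so it goes before -i. It stops
# ffmpeg from applying the display_matrix rotation, leaving the frames
# in their stored orientation.
ffmpeg -noautorotate -i input.mp4 -c:v libx264 -c:a copy output.mp4
```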


Thank you,
Davood Falahati

Re: [FFmpeg-user] multithreaded use of sws_scale

2017-09-14 Thread Davood Falahati
I think this question is out of the scope of the ffmpeg-user list; you
should ask it on libav-user instead.
By the way, if you want to scale an AVFrame to various dimensions in
different threads, you should instantiate a separate SwsContext for each
thread.

__

Tel: +98 (913) 126-0265

On Thu, Sep 14, 2017 at 10:44 PM, Martin Belleau <
mbell...@capellasystems.net> wrote:

> I see that sws_scale has srcSliceY and srcSliceH, which should allow to
> perform scaling of different slices in parallel.
>
> Is there any sample code which demonstrates how to use it?
>
> Should the same SwsContext be used to scale all the slices?
>

[FFmpeg-user] FFMPEG sample example does not work!

2017-06-27 Thread Davood Falahati
Dear community,

I am doing a video processing project with OpenCV and I have to read and
write video/audio streams with ffmpeg. I found the ffmpeg-output-example
useful, but I had a tough time updating it (I don't know why the example
has not been updated, even though parts of its code are deprecated!). I
have uploaded my working code at:
https://github.com/falahati1987/video/blob/master/ffmpeg-example.cpp

When I run the code, I get the following error:

[libx264 @ 0xee2400] broken ffmpeg default settings detected
[libx264 @ 0xee2400] use an encoding preset (e.g. -vpre medium)
[libx264 @ 0xee2400] preset usage: -vpre <speed> -vpre <profile>
[libx264 @ 0xee2400] speed presets are listed in x264 --help
[libx264 @ 0xee2400] profile is optional; x264 defaults to high

My sample video's specifications are:

Output #0, mov, to '/home/dfalahati/Videos/melissa.MOV':
Stream #0:0: Video: h264, yuv420p, 352x288, q=2-31, 400 kb/s, 25 tbc
Stream #0:1: Audio: aac, 44100 Hz, 2 channels, 64 kb/s

And I am using ffmpeg 2.8.11 on Ubuntu 16.04. I have not compiled ffmpeg
from source.
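For what it's worth, the log itself points at the likely cause: libx264
builds of that era rejected libavcodec's generic defaults and required an
explicit preset (in API code, set via av_opt_set(codec_ctx->priv_data,
"preset", "medium", 0) before avcodec_open2()). The command-line equivalent
is roughly the following sketch (file names are placeholders; on ffmpeg 2.8
the option is -preset rather than the older -vpre shown in the log):

```shell
# ffmpeg 2.8's native aac encoder was still experimental, hence -strict -2.
ffmpeg -i input.avi -c:v libx264 -preset medium \
       -c:a aac -strict -2 output.mov
```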

Can anybody help me?

Best,
Davood Falahati

Re: [FFmpeg-user] Configure error with Android NDK clang compiler

2016-01-15 Thread Davood Falahati
I have not compiled FFmpeg for Android using a standalone toolchain;
however, I had a tough time doing the same with OpenCV-JNI. This question,
asked on Stack Overflow, might help you:
http://stackoverflow.com/questions/34246265/opencv-ann-mlp-training-in-android

I read your NDK build procedure. Don't you point your Application.mk and
Android.mk files to your toolchain? I mean $NDK-PATH/ndk-build? Are you
using Gradle?
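For reference, cross-compiling FFmpeg itself is normally driven through
configure's cross-compile flags rather than Android.mk/Application.mk. A
rough sketch, assuming a standalone toolchain like the one in the quoted
instructions (paths and the exact --target-os value vary; older releases
used --target-os=linux for Android):

```shell
# Put the standalone toolchain's compilers on PATH, then point
# configure at the cross prefix and clang driver.
export PATH="${TOOLCHAIN_DIR}/bin:${PATH}"
./configure \
    --enable-cross-compile \
    --target-os=android \
    --arch=arm \
    --cross-prefix=arm-linux-androideabi- \
    --cc=arm-linux-androideabi-clang
```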


Davood Falahati,

Isfahan University of Technology.
d.falahati.1...@ieee.org

Every 3000 Sheets of paper costs us a tree.. Save trees... Conserve
Trees. Don't print this email or any files unless you really need to!

On Fri, Jan 15, 2016 at 2:21 AM, Marco Pracucci <marco.pracu...@spreaker.com
> wrote:

> I successfully built and ran ffmpeg in production on Android (arm and x86),
> compiled with the Android NDK's gcc. Recently, Android deprecated gcc and is
> pushing every dev to migrate to clang.
>
> I'm getting some issues while compiling ffmpeg with Android NDK's clang and
> this is the first one I encounter:
> *GNU assembler not found, install/update gas-preprocessor*
>
> Despite the message, it looks like the error is more subtle, because it
> fails while running "check_as", but if I run the exact same check command
> in the terminal, *it works*.
>
> Below you can find the full instructions to reproduce it (the error goes
> away with --disable-asm, but with gcc I'm able to compile with asm support).
>
> Versions:
> - ffmpeg 2.8.4
> - Compile on OSX 10.9.5
>
> *Download Android NDK*
> You can download the Android NDK from here:
> ​http://developer.android.com/ndk/downloads/index.html
>
> *Install Android NDK*
> Follow the installation instructions in the download and then set the
> environment variable NDK_DIR to the directory where you extracted the NDK.
>
> *Build NDK standalone toolchain*
> Define an environment variable TOOLCHAIN_DIR with the full path of the
> directory where the standalone toolchain should be installed. For example:
> export TOOLCHAIN_DIR=/tmp/toolchain
>
> Then build the toolchain:
> ${NDK_DIR}/build/tools/make-standalone-toolchain.sh \
> --toolchain=arm-linux-androideabi-clang3.6 \
> --platform=android-9 \
> --install-dir=${TOOLCHAIN_DIR}
>
> *Compile ffmpeg*
> I use the following configure script:
> ./configure \
> --enable-shared \
> --disable-static \
> --enable-gpl \
> --enable-version3 \
> --enable-nonfree \
> --disable-runtime-cpudetect \
> --disable-all \
> --disable-doc \
> --enable-avcodec \
> --enable-avformat \
> --enable-avutil \
> --enable-swresample \
> --disable-w32threads \
> --disable-os2threads \
> --disable-network \
> --disable-dxva2 \
> --disable-vaapi \
> --disable-vda \
> --disable-vdpau \
> --enable-protocol="file" \
> --enable-decoder="aac" \
> --enable-decoder="cook" \
> --enable-decoder="flac" \
> --enable-decoder="mp3" \
> --enable-decoder="mp3adu" \
> --enable-decoder="mp3adufloat" \
> --enable-decoder="mp3float" \
> --enable-decoder="mp3on4" \
> --enable-decoder="mp3on4float" \
> --enable-decoder="pcm_alaw" \
> --enable-decoder="pcm_bluray" \
> --enable-decoder="pcm_dvd" \
> --enable-decoder="pcm_f32be" \
> --enable-decoder="pcm_f32le" \
> --enable-decoder="pcm_f64be" \
> --enable-decoder="pcm_f64le" \
> --enable-decoder="pcm_lxf" \
> --enable-decoder="pcm_mulaw" \
> --enable-decoder="pcm_s16be" \
> --enable-decoder="pcm_s16be_planar" \
> --enable-decoder="pcm_s16le" \
> --enable-decoder="pcm_s16le_planar" \
> --enable-decoder="pcm_s24be" \
> --enable-decoder="pcm_s24daud" \
> --enable-decoder="pcm_s24le" \
> --enable-decoder="pcm_s24le_planar" \
> --enable-decoder="pcm_s32be" \
> --enable-decoder="pcm_s32le" \
> --enable-decoder="pcm_s32le_planar" \
> --enable-decoder="pcm_s8" \
> --enable-decoder="pcm_s8_planar" \
> --enable-decoder="pcm_u16be" \
> --enable-decoder="pcm_u16le" \
> --enable-decoder="pcm_u24be" \
> --enable-decoder="pcm_u24le" \
> --enable-decoder="pcm_u32be" \
> --enable-decoder="pcm_u32le" \
> --enable-decoder="pcm_u8" \
> --enable-decoder="pcm_zork" \
> --enable-decoder="ra_144" \
> --enable-decoder="ra_288" \
> --enable-decoder="ralf" \
> --enable-decoder="vorbis" \
> --enable-decoder="wmav1" \
> --enable-decoder="wmav2" \
> --enable-decoder="wmavoice" 

Re: [FFmpeg-user] FFMPEG to Tangberg video conferencing

2015-11-18 Thread Davood Falahati
Moritz,

Thank you for your reply. The device's specifications, as shown in its
web-page panel, are attached to this mail. The device is called TANDBERG;
sorry for the typo. It is a video conferencing device with a complete set
of I/O peripherals. I need to connect to it with ffmpeg and view its
content as well.


Thank you,
Davood

Davood Falahati,

Isfahan University of Technology.
d.falahati.1...@ieee.org

Every 3000 Sheets of paper costs us a tree.. Save trees... Conserve
Trees. Don't print this email or any files unless you really need to!

On Wed, Nov 18, 2015 at 12:38 AM, Moritz Barsnick <barsn...@gmx.net> wrote:

> Hi Davood,
>
> On Tue, Nov 17, 2015 at 14:48:47 +0330, Davood Falahati wrote:
>
> > It occurred to me that I have a Tangberg video conferencing device and I
> > want to connect to it with ffmpeg. All I know is that it uses H.323 on
> > port 1720. Please tell me how.
>
> You need to be more specific, or fetch more information from the
> documentation. How would users *not* using ffmpeg access the device?
>
> Possibly with something like "http://<device-address>:1720",
> which you could pass as input to ffmpeg. But how are we to know. Did
> you happen to mention what the device is exactly called?
>
> Moritz


[FFmpeg-user] FFMPEG to Tangberg video conferencing

2015-11-17 Thread Davood Falahati
Dear Community,

I have a Tangberg video conferencing device and I want to connect to it
with ffmpeg. All I know is that it uses H.323 on port 1720. Please tell me
how.

B.R.
Davood Falahati,

Isfahan University of Technology.
d.falahati.1...@ieee.org


[FFmpeg-user] FFMPEG in Android

2015-09-13 Thread Davood Falahati
Dear All,

I wish to embed ffmpeg in my Android application. I found some documents on
the Internet, but they did not address my needs, which are listed below:

Firstly, I chiefly develop on Windows, and the documents I read require
compiling ffmpeg for the Android platform. The question is: are there any
pre-compiled binaries and/or libraries that would let me skip the
compilation step? If so, please give me a hint.

Secondly, I am currently on Android Studio (AS) and Gradle. How should I
configure AS?

Thank you in advance.

Regards,
Davood


Re: [FFmpeg-user] hls_size

2015-03-20 Thread Davood Falahati
Thank you,

I finally managed to do that. What are your recommendations for segment
length and list size for low-latency live broadcasting?

Davood Falahati,

PhD candidate in Isfahan University of Technology.
d.falahati.1...@ieee.org


On Fri, Mar 20, 2015 at 8:22 AM, Bogdan ioan Gabor 
bogdanioan.gabor-at-yahoo@ffmpeg.org wrote:

 Sorry, I meant g = segment_length * framerate



  On Friday, March 20, 2015 5:21 PM, Bogdan ioan Gabor 
 bogdanioan.ga...@yahoo.com wrote:


 Hi,
 I had the same issue, and I managed to fix it by setting the framerate (-r
 parameter) and the GOP (-g parameter), which I think depends on the
 framerate. So, for 3-second segments, I used a framerate of 10 and a GOP
 of 30. I thought it's good/normal to have an I-frame in every segment. My
 advice would be to just make sure that you have at least one I-frame for
 each segment.
 g = segment_length + framerate.
 Good luck.




  On Saturday, March 14, 2015 9:16 PM, Davood Falahati 
 d.falahati.1...@ieee.org wrote:


  I want to stream my capture card using HLS. I get the input well and
 everything seems to be fine. The problem is the latency. I want to lessen
 the chunk sizes to 3 seconds or less, and put less than 5 chunks in every
 list. But when I run the ffmpeg as below:

 ffmpeg -re -i rtmp://serveradress/live/channel -codec copy -map 0 -f
 segment -segment_list playlist.m3u8 -delete -segment_list_flags +live
 -segment_time 3 out%03d.ts

 the generated segments are much longer than 3 seconds. I think the problem
 is with I-frames. How should I fix that issue?








[FFmpeg-user] hls_size

2015-03-14 Thread Davood Falahati
I want to stream my capture card using HLS. I get the input well and
everything seems to be fine. The problem is the latency. I want to lessen
the chunk sizes to 3 seconds or less, and put less than 5 chunks in every
list. But when I run the ffmpeg as below:

ffmpeg -re -i rtmp://serveradress/live/channel -codec copy -map 0 -f
segment -segment_list playlist.m3u8 -delete -segment_list_flags +live
-segment_time 3 out%03d.ts

the generated segments are much longer than 3 seconds. I think the problem
is with I-frames. How should I fix that issue?
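One way to attack this (a sketch, not the only fix): with -codec copy,
segment boundaries can only land on keyframes already present in the
source, so segments stretch to the next I-frame. Re-encoding with a GOP
matched to the segment length places an I-frame at every boundary; the
GOP follows from segment_time * framerate (the rtmp address is the
placeholder used in the command above):

```shell
# Pick a GOP equal to one segment's worth of frames so each 3-second
# segment can start on an I-frame.
segment_time=3
framerate=25
gop=$((segment_time * framerate))
echo "gop=$gop"   # prints gop=75

# Hypothetical re-encoding invocation built on the command above:
# ffmpeg -re -i rtmp://serveradress/live/channel -map 0 \
#        -c:v libx264 -r $framerate -g $gop -keyint_min $gop -sc_threshold 0 \
#        -c:a copy -f segment -segment_list playlist.m3u8 \
#        -segment_list_flags +live -segment_time $segment_time out%03d.ts
```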