Re: [Farsight-devel] Can't send a video with Farsight/Telepathy (1/2)

2010-04-19 Thread Fabien LOUIS
Hi,

Thanks for your response.

For my test, I send the camera from the SLAVE console and a video file (with
uridecodebin) from the MASTER console.
MASTER receives the slave's camera, but SLAVE receives nothing (no
'src_pad_added' signal).

The last lines in the MASTER console were:
  === <__main__.TfListener object at 0x948354c> __src_pad_added ===
  (test_farsight.py:4914): tp-fs-DEBUG: stream 1 0x92e79f0 (video)
set_remote_codecs: called
  (test_farsight.py:4914): tp-fs-DEBUG: stream 1 0x92e79f0 (video)
set_remote_codecs: adding remote codec H264 [97]
  (test_farsight.py:4914): tp-fs-DEBUG: stream 1 0x92e79f0 (video)
set_remote_codecs: adding remote codec THEORA [96]
  (test_farsight.py:4914): tp-fs-DEBUG: stream 1 0x92e79f0 (video)
set_remote_codecs: adding remote codec H263 [34]
  (test_farsight.py:4914): tp-fs-DEBUG: stream 1 0x92e79f0 (video)
set_remote_codecs: adding remote codec JPEG [26]
  (test_farsight.py:4914): tp-fs-DEBUG: stream 1 0x92e79f0 (video)
set_remote_codecs: adding remote codec MPV [32]
  (test_farsight.py:4914): tp-fs-DEBUG: stream 1 0x92e79f0 (video)
set_remote_codecs: adding remote codec H263-1998 [98]
  (test_farsight.py:4914): tp-fs-DEBUG: stream 1 0x92e79f0 (video)
_tf_stream_try_sending_codecs: called (send_local:0 send_supported:1)
  (test_farsight.py:4914): tp-fs-DEBUG: stream 1 0x92e79f0 (video)
_tf_stream_try_sending_codecs: Ignoring new codecs because we're
sending, but we're not ready

I also tested with a queue at the end of my source pipeline; no change.
To create my pipeline, I use "gst.parse_bin_from_description".
I looked at your aMSN farsight example, but I don't quite understand it.

If you have any ideas, please let me know.
Fabien


2010/4/15 Youness Alaoui :
> Hi,
>
> Your email (and the log) doesn't say anything about why or *how* it
> doesn't work. Can you explain in more detail what exactly happens? Do you get
> a GStreamer error? Does the pipeline freeze? Does the other side see nothing,
> or only noise, etc.?
>
> Also, although it probably doesn't matter, I usually put a queue at the end of
> the source pipeline when I use a decodebin. I can't remember exactly why, but
> it may help prevent issues like yours.
> Also, if you create the bin from that pipeline string with gst_parse_launch,
> then you should do this:
>    GST_OBJECT_FLAG_UNSET (bin, GST_ELEMENT_IS_SINK);
> otherwise it won't work: gst_parse_launch assumes the pipeline is self-contained
> and sets the SRC and SINK flags on it, which causes issues with the other
> elements in your Farsight pipeline.
>
> See aMSN's farsight code for how I did it (and tested it to work with
> scenarios like yours):
>
> http://amsn.svn.sourceforge.net/viewvc/amsn/trunk/amsn/utils/farsight/src/tcl_farsight.c?revision=11846&view=markup
> (lines 799 to 835)
>
> Youness.
>
>
> Fabien LOUIS wrote:
>> Hi all,
>>
>> I currently have a test program which works with Telepathy and
>> Farsight. Its goal is to connect two clients (a slave which listens and a
>> master which initiates the call) and exchange a video stream.
>> I can send "videotestsrc", the user's webcam with "gconfvideosrc", and
>> sound with "audiotestsrc".
>>
>> However, I am not able to send a video file with "uridecodebin". Why
>> do I need to change more in my code than the GStreamer source line?
>> Sending a video file and sending "videotestsrc" should be similar, no?
>> Here is the bin description I use: "uridecodebin
>> uri=file:///home/fabien/Bureau/Video/test.avi ! identity sync=True !
>> ffmpegcolorspace ! videoscale ! video/x-raw-yuv,width=320,height=240
>> ! ffmpegcolorspace ! timeoverlay"
>>
>> If anyone could explain to me why it doesn't work and how I can fix my
>> problem, I would be very happy.
>> I've attached my test file and the console logs.
>>
>> I run the first client (slave) with "reset ; killall telepathy-gabble
>> ; python ./test_farsight.py slave --autoconnect"
>> and then the second client (master) with "reset && python
>> test_farsight.py master --video".
>>
>> If you need more details, please ask me.
>>
>> Thank you very much,
>> Fabien
>>
>> PS: I will send debug files in a second mail.
>>
>>

--

Re: [Farsight-devel] Can't send a video with Farsight/Telepathy (1/2)

2010-04-15 Thread Youness Alaoui
Hi,

Your email (and the log) doesn't say anything about why or *how* it
doesn't work. Can you explain in more detail what exactly happens? Do you get
a GStreamer error? Does the pipeline freeze? Does the other side see nothing,
or only noise, etc.?

Also, although it probably doesn't matter, I usually put a queue at the end of
the source pipeline when I use a decodebin. I can't remember exactly why, but
it may help prevent issues like yours.
Also, if you create the bin from that pipeline string with gst_parse_launch,
then you should do this:
    GST_OBJECT_FLAG_UNSET (bin, GST_ELEMENT_IS_SINK);
otherwise it won't work: gst_parse_launch assumes the pipeline is self-contained
and sets the SRC and SINK flags on it, which causes issues with the other
elements in your Farsight pipeline.

See aMSN's farsight code for how I did it (and tested it to work with
scenarios like yours):

http://amsn.svn.sourceforge.net/viewvc/amsn/trunk/amsn/utils/farsight/src/tcl_farsight.c?revision=11846&view=markup
(lines 799 to 835)
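
Roughly, the idea is something like this (just a sketch, not the actual aMSN
code; the pipeline description is your example file, and "sink_pad" stands for
whatever pad your Farsight/telepathy-farsight stream expects the source to be
linked to):

  #include <gst/gst.h>

  /* Build a video source bin from a description string, with a queue at
   * the end, and link it to sink_pad (illustrative name). */
  static gboolean
  add_video_source (GstPipeline *pipeline, GstPad *sink_pad)
  {
    GError *error = NULL;
    GstElement *src, *queue;
    GstPad *pad, *ghost;

    src = gst_parse_bin_from_description (
        "uridecodebin uri=file:///home/fabien/Bureau/Video/test.avi "
        "! ffmpegcolorspace ! videoscale "
        "! video/x-raw-yuv,width=320,height=240 ! queue name=out",
        FALSE, &error);
    if (src == NULL) {
      g_warning ("parse error: %s", error ? error->message : "unknown");
      g_clear_error (&error);
      return FALSE;
    }

    /* Ghost the queue's src pad by hand rather than letting the parser
     * ghost whatever pads happen to be unlinked: uridecodebin only
     * creates its pads once the file is prerolled. */
    queue = gst_bin_get_by_name (GST_BIN (src), "out");
    pad = gst_element_get_static_pad (queue, "src");
    gst_element_add_pad (src, gst_ghost_pad_new ("src", pad));
    gst_object_unref (pad);
    gst_object_unref (queue);

    /* As mentioned above: clear the SINK flag the parser may have left
     * set on the bin (harmless if it was not set). */
    GST_OBJECT_FLAG_UNSET (src, GST_ELEMENT_IS_SINK);

    gst_bin_add (GST_BIN (pipeline), src);
    ghost = gst_element_get_static_pad (src, "src");
    if (gst_pad_link (ghost, sink_pad) != GST_PAD_LINK_OK)
      g_warning ("could not link the source bin");
    gst_object_unref (ghost);

    gst_element_sync_state_with_parent (src);
    return TRUE;
  }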

Youness.


Fabien LOUIS wrote:
> Hi all,
> 
> I currently have a test program which works with Telepathy and
> Farsight. Its goal is to connect two clients (a slave which listens and a
> master which initiates the call) and exchange a video stream.
> I can send "videotestsrc", the user's webcam with "gconfvideosrc", and
> sound with "audiotestsrc".
> 
> However, I am not able to send a video file with "uridecodebin". Why
> do I need to change more in my code than the GStreamer source line?
> Sending a video file and sending "videotestsrc" should be similar, no?
> Here is the bin description I use: "uridecodebin
> uri=file:///home/fabien/Bureau/Video/test.avi ! identity sync=True !
> ffmpegcolorspace ! videoscale ! video/x-raw-yuv,width=320,height=240
> ! ffmpegcolorspace ! timeoverlay"
> 
> If anyone could explain to me why it doesn't work and how I can fix my
> problem, I would be very happy.
> I've attached my test file and the console logs.
> 
> I run the first client (slave) with "reset ; killall telepathy-gabble
> ; python ./test_farsight.py slave --autoconnect"
> and then the second client (master) with "reset && python
> test_farsight.py master --video".
> 
> If you need more details, please ask me.
> 
> Thank you very much,
> Fabien
> 
> PS: I will send debug files in a second mail.
> 