On Tue, Jan 11, 2011 at 11:50:05PM +0800, 梁亮 wrote:
> Hi,
> 
> Sorry for any confusion.  My first question is:
> judging from the code, red_send_stream_data() does not support 
> SPICE_BITMAP_FMT_RGBA.  My understanding is that this is also a 32-bit 
> bitmap format: RGB plus alpha, 8 bits each.  For such an unsupported format, 
> the current approach is to fall back to the usual compression algorithms 
> instead of MJPEG.  Why not convert it to a 24-bit bitmap first and use MJPEG?
> 

Sounds right.
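
Something like the sketch below is what I'd imagine (untested, and the helper
name is made up - nothing like it exists in red_worker.c today): just drop the
alpha byte of each pixel before handing the frame to the mjpeg encoder.

#include <stdint.h>
#include <stddef.h>

/* Hypothetical helper: convert one line of SPICE_BITMAP_FMT_RGBA
 * (8 bits each of R, G, B, A) into a packed 24-bit line by dropping
 * the alpha byte.  Whether the colour bytes come as R,G,B or B,G,R
 * depends on how the formats are defined - check enums.h. */
static void rgba_line_to_rgb24(const uint8_t *src, uint8_t *dst, size_t width)
{
    size_t i;

    for (i = 0; i < width; i++) {
        dst[0] = src[0];
        dst[1] = src[1];
        dst[2] = src[2];
        src += 4;          /* skip the alpha byte */
        dst += 3;
    }
}

The extra copy costs a bit of cpu per frame, but mjpeg throws the alpha away
anyway, so nothing is lost visually.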

> The reason I ask is that I set up a guest OS (Windows XP) using qemu+kvm 
> with spice and qxl enabled (the qxl driver is also installed in the guest). 
> The guest desktop uses 32-bit color depth.  When I play a video in the 
> guest, I see an "unsupported format 9" message in the log.
> 

Thanks, that sounds like a bug. It would be great if you supplied information 
on how to recreate this, or better yet filed a bug report, or better still 
wrote a patch :) Bugzilla is at bugzilla.redhat.com (I think there is one set 
up on freedesktop too, but I'm not sure whether we use it for spice).
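
For reference, "format 9" should be SPICE_BITMAP_FMT_RGBA - going by the enum
order in spice/enums.h, from memory, so please double check against your tree:

typedef enum SpiceBitmapFmt {
    SPICE_BITMAP_FMT_INVALID,   /* 0 */
    SPICE_BITMAP_FMT_1BIT_LE,   /* 1 */
    SPICE_BITMAP_FMT_1BIT_BE,   /* 2 */
    SPICE_BITMAP_FMT_4BIT_LE,   /* 3 */
    SPICE_BITMAP_FMT_4BIT_BE,   /* 4 */
    SPICE_BITMAP_FMT_8BIT,      /* 5 */
    SPICE_BITMAP_FMT_16BIT,     /* 6 */
    SPICE_BITMAP_FMT_24BIT,     /* 7 */
    SPICE_BITMAP_FMT_32BIT,     /* 8 */
    SPICE_BITMAP_FMT_RGBA       /* 9 */
} SpiceBitmapFmt;

which would match what you saw: a 32-bit surface with alpha that the streaming
path does not handle yet.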

> Thank you.
> 
> Thanks,
> Liang
> 
> 
> At 2011-01-10 22:24:15,"Alon Levy" <al...@redhat.com> wrote:
> 
> >On Mon, Jan 10, 2011 at 09:20:39PM +0800, 梁亮 wrote:
> >> Hi,
> >> 
> >> Thanks for the information.  Yes, I recently dug into the spice 
> >> implementation and found some clues along the lines of what you 
> >> mentioned :-)  I still have some questions regarding stream data:
> >> 1) From the 0.6.3 source code, it seems the 32-bit bitmap format is not 
> >> handled by the MJPEG path. Why?
> >
> >I don't understand the question - MJPEG isn't a bitmap format.
> >
> >> 2) For the 32-bit bitmap (format 9), there are two ways to process it, 
> >> depending on the JPEG compression flag.  If bandwidth is OK, it uses 
> >> high-quality delivery; otherwise it uses a lossy compression algorithm. 
> >> So what is the difference between that lossy compression and MJPEG?
> >
> >MJPEG is for streaming. Basically, any rendering operation has two paths 
> >when the server reads it from the driver's outgoing queue:
> > * should it be streamed?
> >   * yes:
> >     * is there an existing stream?
> >       * no: create one and create an mjpeg encoder; if a client is 
> >         attached, send the stream creation to it
> >     * push the frame to the mjpeg encoder, take its output and send it to 
> >       the client if one is attached
> >   * no:
> >     * send as usual: apply the compression flags and heuristics to choose 
> >       the best codec, compress, and send it as a draw operation
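> >
> >In (pseudo) C the flow is roughly the following - all the function names
> >here are made up for illustration, the real logic lives in red_worker.c:
> >
> >    /* sketch only, hypothetical names, not the actual red_worker.c code */
> >    static void handle_drawable(RedWorker *worker, Drawable *drawable)
> >    {
> >        if (should_stream(drawable)) {   /* heuristics: size, update rate, ... */
> >            Stream *stream = find_stream(worker, drawable);
> >            if (!stream) {
> >                stream = create_stream(worker, drawable); /* also creates the mjpeg encoder */
> >                if (client_attached(worker))
> >                    send_stream_create(worker, stream);
> >            }
> >            push_to_mjpeg_encoder(stream, drawable);
> >            if (client_attached(worker))
> >                send_stream_data(worker, stream);  /* encoder output -> client */
> >        } else {
> >            /* normal path: pick quic/lz/glz/jpeg from the compression
> >             * flags and heuristics, compress, send as a draw operation */
> >            send_draw_operation(worker, drawable);
> >        }
> >    }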
> >
> >> 3) I'm really puzzled by the spice marshaller code... it seems to be 
> >> generated by some python modules/scripts. Could anyone share something 
> >> about how it works and what its purpose is?
> >> 
> >
> >Well, it's there to allow us to support both the old and the new protocols. 
> >It also has the nice benefit that you can read the whole protocol by 
> >looking at spice.proto (new) and spice1.proto (old), both in the root 
> >directory.
> >
> >> Have a nice day!
> >> 
> >> Thanks,
> >> Liang
> >> 
> >> 
> >> 
> >> At 2011-01-10 18:16:30,"Alon Levy" <al...@redhat.com> wrote:
> >> 
> >> >On Fri, Dec 17, 2010 at 09:51:39PM +0800, 梁亮 wrote:
> >> >> Dear all,
> >> >> 
> >> >> Several days ago I tried spice and was really impressed by its 
> >> >> performance - really good job! Then I tried to dig into the details, 
> >> >> mainly focusing on the graphics subsystem, and learned a lot. Thank you!
> >> >> 
> >> >> While studying the handling of stream data, I got lost in the 
> >> >> interaction between the qemu display driver and the spice server, so 
> >> >> I am sending this mail to ask you experts some simple questions  :-)
> >> >> 1. For video played in the guest OS, my understanding is that the 
> >> >> media player (for example, mplayer) in the guest decodes the video 
> >> >> file and the display driver (qxl driver) processes the decoded frames. 
> >> >> Is that right? Does the display driver then store the frame data in 
> >> >> RedWorker's stream chain? Is every frame stored, or just some key 
> >> >> frames? This is just my assumption - I have not found proof of it in 
> >> >> the code. I hope you could share some hints about the details.
> >> >
> >> >The frames are fed one by one to the mjpeg encoder on the server, and its 
> >> >output is sent to the client if one is connected. Look in red_worker.c 
> >> >and track mjpeg_encoder and streams (streams is a member of RedWorker).
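> >> >
> >> >There are no key frames involved: MJPEG is just a sequence of
> >> >independently compressed JPEG frames, so every frame the driver hands
> >> >over gets encoded and sent. Very roughly (illustrative names only, not
> >> >the real encoder API):
> >> >
> >> >    /* sketch - the real code is red_send_stream_data() and
> >> >     * mjpeg_encoder.c; these names and types are made up */
> >> >    static void stream_send_frame(RedWorker *worker, Stream *stream,
> >> >                                  const uint8_t *frame_bits)
> >> >    {
> >> >        uint8_t *jpeg_data;
> >> >        size_t jpeg_size;
> >> >
> >> >        jpeg_size = encode_jpeg_frame(stream->encoder, frame_bits, &jpeg_data);
> >> >        if (client_connected(worker))
> >> >            send_stream_data_message(worker, stream, jpeg_data, jpeg_size);
> >> >        free(jpeg_data);
> >> >    }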
> >> >
> >> >> 2. Each stream is encoded with MJPEG on the server side and decoded on 
> >> >> the client side. Has the team considered a codec with a higher 
> >> >> compression ratio, like mpeg-4?  If so, could you share the reason it 
> >> >> was not adopted? My understanding is that video is the killer 
> >> >> application for a remote display system and consumes a lot of bandwidth.
> >> >
> >> >We should definitely try adjusting the codec and its parameters based on 
> >> >bandwidth/latency and cpu requirements.
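> >> >
> >> >Even without switching codecs, something as simple as scaling the JPEG
> >> >quality (or the frame rate) with the measured bandwidth would probably
> >> >help already - just an idea, nothing like this exists in the code today:
> >> >
> >> >    /* toy illustration, purely hypothetical */
> >> >    static int pick_mjpeg_quality(uint64_t bytes_per_sec)
> >> >    {
> >> >        if (bytes_per_sec > 10 * 1024 * 1024)
> >> >            return 90;   /* plenty of bandwidth: high quality */
> >> >        if (bytes_per_sec > 2 * 1024 * 1024)
> >> >            return 70;
> >> >        return 50;       /* tight link: compress harder */
> >> >    }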
> >> >
> >> >> 
> >> >> Thank you and have a nice weekend!
> >> >> 
> >> >> Thanks,
> >> >> Liang
> >> >
_______________________________________________
Spice-devel mailing list
Spice-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/spice-devel
