Hi All,

I've been experimenting with embedded Linux and matrixed LED displays.  I
started with a Raspberry Pi userspace program, but saw visual artifacts
on the display due to the inconsistent timing of my sleep calls.  So I
figured that moving the basic row scanning into the kernel would help.
After failing to get the right kernel headers for the Pi, I switched to
a BeagleBone White.  I now have a working character-device LKM that
accepts new images as ASCII-formatted hex strings written to its node in
/dev.  The performance is pretty good, but not great: I still see
visible artifacts, though I'm still experimenting.

My basic question is this: I know that Linux is not an RTOS, so timing
will never be guaranteed, yet Linux does a lot of things very quickly
(video, audio, I2C, etc.).  My driver is bit-banging an SPI-like stream
over 8 rows at ~3 ms per row (333 Hz row scanning, or ~41 Hz per
complete frame) and is really struggling.  How does Linux usually
deliver large, smooth video at over 60 FPS while doing other things?
Is it simply taking advantage of special hardware resources?

The obvious solution for this display is to use a little 8051 or M0
microcontroller (or the PRU!) and let the Bone feed it over UART or
something, but I really thought I could do better with the LKM.

Am I just doing something totally wrong?  Any other ideas?

Thanks!

--David

-- 
For more options, visit http://beagleboard.org/discuss
--- 
You received this message because you are subscribed to the Google Groups 
"BeagleBoard" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/d/optout.