xiaoxiang781216 commented on PR #12829:
URL: https://github.com/apache/nuttx/pull/12829#issuecomment-2271788789

   > @acassis @xiaoxiang781216 I need your help again. My understanding is that:
   > 
   > * the **fb.c** file holds the framebuffer character driver, which can be 
used to interact with displays or with an image.
   
   Yes, fb is the output side: you can write an fb driver and reuse the 
high-level graphics stacks without change, e.g.:
   https://github.com/apache/nuttx-apps/tree/master/graphics/nxwm
   https://github.com/apache/nuttx-apps/tree/master/graphics/lvgl
   https://github.com/apache/nuttx-apps/tree/master/graphics/twm4nx
   v4l2 is the input side (e.g. a camera) or both sides (e.g. a hardware 
encoder/decoder).
   So fb and v4l2 are different driver frameworks, but both are compatible with 
their Linux counterparts from the userspace perspective.
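   One concrete illustration of that userspace compatibility (a minimal sketch, 
not code from NuttX itself): V4L2 identifies pixel formats with the same FOURCC 
codes on both systems, built the same way as the `v4l2_fourcc()` macro in the 
`videodev2.h` headers, so format negotiation code ports over directly.

   ```c
   #include <assert.h>
   #include <stdint.h>
   #include <stdio.h>

   /* Same construction as the v4l2_fourcc() macro: four ASCII characters
    * packed little-endian into a 32-bit pixel-format code. */
   static uint32_t fourcc(char a, char b, char c, char d)
   {
     return (uint32_t)a | ((uint32_t)b << 8) |
            ((uint32_t)c << 16) | ((uint32_t)d << 24);
   }

   int main(void)
   {
     /* 'YUYV' is the packed YUV 4:2:2 format many camera sensors emit;
      * the numeric value matches V4L2_PIX_FMT_YUYV. */
     uint32_t yuyv = fourcc('Y', 'U', 'Y', 'V');

     printf("YUYV = 0x%08x\n", yuyv);
     assert(yuyv == 0x56595559);
     return 0;
   }
   ```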
   
   > * I would need to make my sensor "compatible" with v4l2 (interaction with 
my sensor should be done through v4l2).
   >   Now the things that I do not understand:
   > * There is goldfish_*.c; my guess is that it is a driver used with qemu (a 
pipe or bridge or something),
   
   The goldfish camera driver 
(https://github.com/apache/nuttx/blob/master/drivers/video/goldfish_camera.c) 
is a pseudo device coming from the goldfish (qemu) emulator (used by Android), 
which can acquire images from the host camera through a pipe.
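   The "image over a pipe" idea can be sketched in plain POSIX C. This is 
purely illustrative: the real framing and commands of the goldfish wire 
protocol live in goldfish_camera.c and the emulator, and the length-prefixed 
format below is invented for the sketch.

   ```c
   #include <assert.h>
   #include <stdint.h>
   #include <string.h>
   #include <unistd.h>

   /* Hypothetical framing: a 32-bit payload length followed by the payload
    * bytes. Not the actual goldfish protocol, just the general shape of
    * passing frames from the emulator (host camera) to the guest driver. */
   static void send_frame(int fd, const uint8_t *data, uint32_t len)
   {
     assert(write(fd, &len, sizeof(len)) == (ssize_t)sizeof(len));
     assert(write(fd, data, len) == (ssize_t)len);
   }

   static uint32_t recv_frame(int fd, uint8_t *buf, uint32_t bufsz)
   {
     uint32_t len;

     assert(read(fd, &len, sizeof(len)) == (ssize_t)sizeof(len));
     assert(len <= bufsz);
     assert(read(fd, buf, len) == (ssize_t)len);
     return len;
   }

   int main(void)
   {
     int fds[2];
     uint8_t rx[16];
     const uint8_t fake_pixels[4] = {0x59, 0x55, 0x59, 0x56};

     assert(pipe(fds) == 0);  /* stands in for the emulator->guest channel */
     send_frame(fds[1], fake_pixels, sizeof(fake_pixels));

     uint32_t got = recv_frame(fds[0], rx, sizeof(rx));
     assert(got == 4 && memcmp(rx, fake_pixels, 4) == 0);
     return 0;
   }
   ```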
   
   > * I see that the isx012 and isx019 camera drivers both use imgdata.h and 
imgsensor.h
   > 
   
   These two are real image sensors, with I2C for control and CSI (I guess) for 
data.
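   "I2C for control" in practice means each setting is a small register write. 
A minimal sketch of building such a write (the 16-bit register address and the 
value below are invented for illustration; the real register maps are in the 
isx012/isx019 driver sources):

   ```c
   #include <assert.h>
   #include <stddef.h>
   #include <stdint.h>

   /* Build the byte sequence for a typical camera-sensor register write:
    * a 16-bit register address (big-endian on the wire) plus an 8-bit
    * value. In a NuttX driver this buffer would then be sent over the
    * I2C master; the register number here is hypothetical. */
   static size_t build_reg_write(uint8_t *buf, uint16_t reg, uint8_t val)
   {
     buf[0] = (uint8_t)(reg >> 8);
     buf[1] = (uint8_t)(reg & 0xff);
     buf[2] = val;
     return 3;
   }

   int main(void)
   {
     uint8_t msg[3];
     size_t n = build_reg_write(msg, 0x3012, 0x40); /* made-up exposure reg */

     assert(n == 3);
     assert(msg[0] == 0x30 && msg[1] == 0x12 && msg[2] == 0x40);
     return 0;
   }
   ```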
   
   > My main issue is that I'm not sure about the flow of data from sensor -> 
driver code -> v4l2 -> userspace. What is the link between them? I've read 
everything I could about these topics, but there is not much documentation on 
the nuttx side. Could any of you help me with a short description of how the 
data should get from the sensor to userspace?
   
   The imgsensor driver (provided by the sensor vendor) is for the real sensor 
with its I2C and CSI interfaces; the imgdata driver (provided by the SoC 
vendor) is for the SoC side, which has a CSI master to drive the sensor. In 
this case:
   
   1. Most settings (e.g. setting the exposure) are done by the imgsensor 
driver through the I2C interface.
   2. Data transfer is done by the imgdata driver, which enables DMA to fetch 
the image through CSI.
   3. Some controls (e.g. start/stop) are handled by both the imgsensor and 
imgdata drivers.
   
   Together, imgsensor and imgdata form a complete video device (/dev/videox).
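   The division of labor above can be sketched with two illustrative ops 
tables. The struct names and callbacks below are simplified stand-ins, not the 
real interfaces (those are struct imgsensor_ops_s and struct imgdata_ops_s 
under include/nuttx/video/); the point is that one user "start capture" 
request fans out to both halves.

   ```c
   #include <assert.h>
   #include <stdbool.h>

   /* Simplified stand-ins for the imgsensor/imgdata ops tables. */
   struct sensor_ops { void (*start)(void); void (*stop)(void); };
   struct data_ops   { void (*start_dma)(void); void (*stop_dma)(void); };

   static bool g_sensor_streaming = false;
   static bool g_dma_running      = false;

   /* imgsensor half: tells the sensor (over I2C) to start/stop streaming. */
   static void sensor_start(void) { g_sensor_streaming = true; }
   static void sensor_stop(void)  { g_sensor_streaming = false; }

   /* imgdata half: arms/disarms the SoC's CSI receiver and its DMA. */
   static void data_start(void) { g_dma_running = true; }
   static void data_stop(void)  { g_dma_running = false; }

   static const struct sensor_ops g_sensor = { sensor_start, sensor_stop };
   static const struct data_ops   g_data   = { data_start, data_stop };

   /* The video framework's start-capture path: one request is dispatched
    * to both halves, which together back /dev/videox. */
   static void video_start_capture(void)
   {
     g_data.start_dma();   /* be ready to receive before the sensor streams */
     g_sensor.start();
   }

   int main(void)
   {
     video_start_capture();
     assert(g_sensor_streaming && g_dma_running);
     return 0;
   }
   ```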
   
   If your sensor uses a single SPI bus for both control and data, you would 
normally implement imgsensor and imgdata in one file, just like the goldfish 
camera driver, so that all control and data handling lives in one piece of 
code.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
