Hello Andrey,

thanks for your help. The issue with the trigger is currently on hold and I am concentrating on reading out the raw sensor data from the video memory.

You wrote that one of the drivers provides raw access to the whole video memory as one large contiguous file, while the other driver provides access to the actual captured frame data.
Do you mean the x393_videomem.c driver? It seems this driver does not have that functionality implemented yet (https://git.elphel.com/Elphel/linux-elphel/blob/master/src/drivers/elphel/x393_videomem.c#L417).

You also wrote that in memory the frame width is rounded up, so there are gaps between the sensor pixel data.
This means that every scan line, independent of the width, sits in an 8192-byte region and the next scan line starts at the next 8192-byte boundary. Also, the two frames for each sensor are consecutive in the video memory without any gap, apart from the rounding up to 8192 bytes. Is this correct?
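
If that is the case, I would compute pixel offsets roughly as in the sketch below. This only encodes my assumption of an 8192-byte line stride and two back-to-back frames per channel; all names and numbers are mine, not taken from the driver:

#include <stdint.h>
#include <stdio.h>

#define LINE_STRIDE        8192u  /* assumed scan-line alignment in bytes */
#define FRAMES_PER_CHANNEL 2u     /* assumed ping-pong buffering          */

/* Byte offset of pixel (x, y) in frame 'frame' of one sensor channel,
 * relative to the start of that channel's buffer.                      */
static uint64_t pixel_offset(uint32_t x, uint32_t y, uint32_t frame,
                             uint32_t height, uint32_t bytes_per_pixel)
{
    uint64_t frame_size = (uint64_t)height * LINE_STRIDE; /* rounded-up frame */
    return (uint64_t)frame * frame_size
         + (uint64_t)y * LINE_STRIDE
         + (uint64_t)x * bytes_per_pixel;
}

int main(void)
{
    /* Example: pixel (100, 200) of the second frame, 8 bpp, 1936 lines
     * (the height here is just an example value).                      */
    printf("offset = %llu bytes\n",
           (unsigned long long)pixel_offset(100, 200, 1, 1936, 1));
    return 0;
}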


Kind regards,

Fabjan Sukalia


On 2017-12-15 at 17:57, Elphel Support wrote:
Hello Fabjan,

The sensors used in the 393 have 2 major operational modes - free running and triggered (there are more details in https://blog.elphel.com/2016/10/using-a-flash-with-a-cmos-image-sensor-ers-and-grr-modes/ and in the sensor datasheets). In free-running mode the maximal frame rate does not depend on the exposure time (the exposure can be up to the full frame period). In the triggered mode (from the sensor's "point of view", so it does not matter whether the trigger is received over the cable or generated by the FPGA timer) exposure and readout cannot overlap, so the maximal frame rate is limited to 1/(T_readout + T_exposure). That means the trigger can be missed if the exposure is set too high (for example by the autoexposure daemon). Please describe what trigger problems you had so we can try to reproduce them.
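
Just to illustrate that limit with numbers (the readout time below is a made-up placeholder, not a measured value for any particular sensor):

#include <stdio.h>

int main(void)
{
    /* Triggered mode: exposure and readout cannot overlap, so
     * fps_max = 1 / (T_readout + T_exposure).                        */
    double t_readout  = 0.0155;  /* s, placeholder readout time       */
    double t_exposure = 0.0160;  /* s, the 16 ms exposure you mention */
    printf("max triggered-mode frame rate: %.1f fps\n",
           1.0 / (t_readout + t_exposure));
    return 0;
}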

If your exposure time is short compared to the readout time, you just need to slightly increase the frame period (so it will accommodate both T_readout and T_exposure) and either use manual exposure or specify a maximal exposure time in the autoexposure settings.

If your exposure time is high (not enough light) it is possible to try the following trick:
1) Run the camera in triggered mode (FPS < 1/(T_readout + T_exposure)).
2) Make sure the parameters that define the frame rate in free-running mode are the same for all participating sensors.
3) Limit or set the exposure time so it will never exceed the frame period in free-running mode.
4) Simultaneously (using the broadcast mask) switch all sensors to free-running mode.

Sensors should stay in sync as they use the same source clock and all other parameters are the same.
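
A very rough sketch of that sequence in C; set_sensor_param() and the parameter names are hypothetical placeholders for whatever parameter interface is actually used, not real driver calls - only the order of the steps matters:

#include <stdio.h>

/* Hypothetical helper and parameter names - placeholders only.        */
#define SENSOR_BROADCAST (-1)   /* assumed "all ports at once" mask    */

static void set_sensor_param(int port, const char *name, int value)
{
    /* Stub: a real implementation would write the camera parameter.   */
    printf("port %d: %s = %d\n", port, name, value);
}

static void switch_to_synced_free_running(int t_readout_us, int t_exposure_us)
{
    int period_us = t_readout_us + t_exposure_us;

    /* 1) Run in triggered mode with FPS < 1/(T_readout + T_exposure). */
    set_sensor_param(SENSOR_BROADCAST, "TRIG_MODE", 1);
    set_sensor_param(SENSOR_BROADCAST, "TRIG_PERIOD_US", period_us + 1000);

    /* 2) Make the free-running frame-period parameters identical on
     *    all participating sensors.                                    */
    set_sensor_param(SENSOR_BROADCAST, "FREE_RUN_PERIOD_US", period_us);

    /* 3) Cap the exposure so it never exceeds the free-running frame
     *    period.                                                       */
    set_sensor_param(SENSOR_BROADCAST, "EXPOSURE_MAX_US", t_exposure_us);

    /* 4) Switch all sensors to free-running mode at the same time
     *    using the broadcast mask.                                     */
    set_sensor_param(SENSOR_BROADCAST, "TRIG_MODE", 0);
}

int main(void)
{
    switch_to_synced_free_running(15500, 16000);  /* placeholder times in us */
    return 0;
}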

As for uncompressed data - it should be possible (it is tested with the Python test_mcntrl.py) as there is a DMA-based bridge between the video memory and the system memory. There are drivers ported from the 353 camera that provide access to this memory, but we have not used them yet and need to check their operation. One of the drivers provides raw access to the whole video memory as one large contiguous file; the other driver provides access to the actual captured frame data. In memory the frame width is rounded up, so there are gaps between the sensor pixel data.
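
A minimal sketch of how such a raw-access driver could be used from userland, assuming it exposes the video memory as a device node that supports mmap(); the device path and the mapped size below are placeholders, not the actual node created by the driver:

#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    /* Placeholder path - the real device node depends on the driver.  */
    const char *dev = "/dev/videomem_raw";
    size_t map_len = 8192u * 1936u;   /* one rounded-up frame, example  */

    int fd = open(dev, O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    uint8_t *mem = mmap(NULL, map_len, PROT_READ, MAP_SHARED, fd, 0);
    if (mem == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    /* First pixel of the first scan line of the mapped frame.         */
    printf("pixel[0,0] = %u\n", mem[0]);

    munmap(mem, map_len);
    close(fd);
    return 0;
}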

The next thing depends on the 8/16 bpp modes. In the normal JPEG/JP4 modes the data in the video memory is 8 bpp (after the gamma conversion), so it is possible to simultaneously get both compressed and uncompressed output. In the 16 bpp mode the 12-bit sensor data is shifted left by 3 bits, so different sensors use the full range of a positive short int. In that mode it is not possible to simultaneously get compressed and raw data.
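
Based only on that description, recovering the original value from the 16 bpp data for a 12-bit sensor would be just a right shift (a sketch, assuming no other packing is involved):

#include <stdint.h>
#include <stdio.h>

/* In the 16 bpp layout described above the 12-bit sensor value is
 * stored shifted left by 3 bits.                                      */
static uint16_t raw12_from_mem16(uint16_t stored)
{
    return stored >> 3;            /* back to the 0..4095 range        */
}

int main(void)
{
    uint16_t stored = 4095u << 3;  /* brightest 12-bit value as stored */
    printf("stored=0x%04x raw12=%u\n", stored, raw12_from_mem16(stored));
    return 0;
}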

Video memory buffering can be programmed to use a variable number of frames for each channel; by default it is set to 2, working as a ping-pong buffer. When using compressed output, the operation of the data acquisition channel (writing to the video memory in scan-line order) and the reading of data to the compressors (20x20 overlapping tiles in JPEG mode, non-overlapping 16x16 tiles in JP4 mode) are synchronized in the FPGA (the read channel waits for sufficient lines to be acquired for the next row of tiles), but that is not so for raw data read from the video memory. The FPGA provides 8 individual interrupts for the imaging subsystem - 4 for the sensor acquisition channels (the frame sync signals also internally advance the command sequencers described here - https://blog.elphel.com/2016/09/nc393-development-progress-and-the-future-plans/) and 4 compressor_done interrupts. And there are userland ways to wait for the next frame (e.g. from the PHP extension - https://wiki.elphel.com/wiki/PHP_in_Elphel_cameras).
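
So a raw reader has to do the synchronization itself, roughly as in the sketch below; wait_frame_sync(), last_written_buffer() and read_raw_frame() are hypothetical placeholders (stubbed here) for the interrupt/driver interface - only the ordering is the point:

#include <stdint.h>
#include <string.h>

/* Hypothetical placeholders for whatever interface the drivers expose. */
static int  wait_frame_sync(int channel)     { (void)channel; return 0; } /* block until frame-sync IRQ   */
static int  last_written_buffer(int channel) { (void)channel; return 0; } /* 0 or 1 in the ping-pong pair */
static void read_raw_frame(int channel, int buffer, uint8_t *dst, size_t len)
{ (void)channel; (void)buffer; memset(dst, 0, len); }

/* With the default two-frame ping-pong buffering: wait until the FPGA
 * has finished writing one buffer, then read that buffer while the
 * next frame is being written into the other one.                      */
static void grab_one_raw_frame(int channel, uint8_t *dst, size_t len)
{
    wait_frame_sync(channel);                  /* frame N is complete            */
    int done = last_written_buffer(channel);   /* buffer holding frame N         */
    read_raw_frame(channel, done, dst, len);   /* frame N+1 fills buffer 1 - done */
}

int main(void)
{
    static uint8_t frame[8192u * 16u];         /* small demo buffer              */
    grab_one_raw_frame(0, frame, sizeof frame);
    return 0;
}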

We will check (and update if needed) the drivers that provide access to the video memory.

Andrey




---- On Fri, 15 Dec 2017 05:48:27 -0800 Fabjan Sukalia <fabjan.suka...@qinematiq.com> wrote ----

    Dear Elphel-Team,

    currently I'm working on the synchronization and readout of the
    sensors on the 393. My first goal is to synchronize two or more
    sensors so that the pictures are taken at the exact same time. To
    my understanding the internal trigger could be used for this
    purpose but I'm unsure if this produces a stable video stream with
    the maximum frame rate. Currently I'm unable to confirm this as
    the firmware that is provided by my colleagues has issues with the
    trigger. Therefore I'm asking you if the internal trigger can
    synchronize the sensors and still provide a video stream with the
    highest frame rate possible. Also the maximal exposure time would
    be 16 ms for a 60 fps video.

    My second task is to access the uncompressed data of the sensors.
    These data reside on the memory chip dedicated to the FPGA-part of
    the Zynq. Is there some example code on how to access the
    uncompressed data from a user-space program?

    Thanks in advance.

    Kind regards,

    Fabjan Sukalia

    --
    qinematiq GmbH
    ---------------------------------------------------------
    Fabjan Sukalia        Millergasse 21/5      A - 1060 Vienna
    Tel: +43 1 595 11 21-11           Mobil: +43 664 926 9277
    www.qinematiq.com







--
qinematiq GmbH
---------------------------------------------------------
Fabjan Sukalia       Millergasse 21/5       A-1060 Vienna
www.qinematiq.com

