Re: [Elphel-support] Questions about Elphel 323 Sensor Board

2020-04-17 Thread Nick Duvoisin
Hi Andrey,

I have one board - the sensor board (10324), which contains both the analog
and FPGA components.  This seems to be different from the configuration for
the 363 sensor board (10342), which had just the analog components (there
was a separate CCD control board, the 10347).

According to your website, the 323 camera consisted of one to four 10313
CPU/compressor boards, one sensor board (10324), and one power supply board
(10325).  I have the sensor board, I can buy the 10353 processor board from
you, and I can make the power supply board myself (maybe buying the bare
PCB from you).

If I order the 10353, can you install the 363 software on it?  Will that
software be able to operate the 10324 board without modification (I'm
guessing the KAI-11000 and KAI-11002 are very similar)?  Or can I just use
the image provided on SourceForge?
(https://sourceforge.net/projects/elphel/files/elphel353-8/8.2.16/)

My other question is about the FPGA on the sensor board (Xilinx XC2S300E).
Will it be stable after all these years?  Do you have a copy of the image
used to program it?  I'm wondering just in case there is a problem with it.

Thanks again for all the advice!
Nick






Re: [Elphel-support] Questions about Elphel 323 Sensor Board

2020-04-16 Thread Elphel Support
Nick,

We may have the bare power supply PCB (no components); it was designed
specifically for the 323 camera, so there are no replacements. If I understand
correctly, you have two boards (analog and FPGA)?

I would still recommend using the 10353 (we should have some) and its existing
software (for the 363 - we have one as a reference); designing a completely new
system around NVIDIA would be a really huge job, and even making it program the
FPGA can be somewhat tricky.

Andrey



___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


Re: [Elphel-support] Questions about Elphel 323 Sensor Board

2020-04-16 Thread Nick Duvoisin
Hi Andrey,

Thanks for the quick response!  It looks like it will require a 1D20325
power supply board in addition to the 10353 processor board.  Do you still
have any of those available?  If not, are there any other power supply
boards that could be used in lieu of the 1D20325?

Also, is there any documentation on how to interface with these sensor
boards?  It would be interesting to try and use an NVIDIA Jetson Nano (
https://developer.nvidia.com/embedded/jetson-nano-developer-kit) to drive
the sensor board.  I know that replacing FPGAs with GPUs is a trend for
certain imaging applications.

Nick



Re: [Elphel-support] Questions about Elphel 323 Sensor Board

2020-04-15 Thread Elphel Support
Hi Nick,

I believe it also needs other boards - I need to check; these boards were made
before we started wiki.elphel.com and posted documentation there. The
documentation for these boards is here:

http://legacy.elphel.com/3fhlo/

The 323 cameras used the older 10313 boards (two of them, as the bandwidth of
the older ETRAX processor was insufficient to provide the required 1/1.5 s
frame rate).
It is possible to build a model 363 camera that uses one 10353 instead of the
two 10313 boards - https://wiki.elphel.com/wiki/353_legacy
We do have 10353 boards, but not the other ones. And we have one 363 camera,
so we can probably check what software is on it.

The 323 cameras were operated with JP4 encoding
(https://community.elphel.com/jp4/jp4demo.php), which was developed
specifically for that purpose - no de-Bayering in the camera. The same was
used in the Street View R5 (https://en.wikipedia.org/wiki/Google_Street_View);
it is shown in the picture there captioned "A Google Street View trike" - the
large black octagon that provided the first high-res imagery.
We still use the JP4 format in all our current cameras. Internally it uses the
same JPEG engine that we implemented in the FPGA; it just reorders pixels. The
JPEG quality can be set to any value, including 100%.
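[To make the "just reorders pixels" idea concrete, here is a minimal sketch - an
editor's illustration, not Elphel's actual FPGA or JP4 code, and the quadrant
ordering is an assumption: within each 16x16 macroblock the four Bayer phases
(even/odd row x even/odd column) are regrouped into four 8x8 blocks, which the
ordinary JPEG engine then compresses without any de-Bayering.]

```c
#include <assert.h>

/* Sketch of JP4-style reordering for one 16x16 macroblock: pixels of each
 * Bayer phase (even/odd row x even/odd column) are gathered into their own
 * 8x8 block, so same-color samples are compressed together.
 * NOTE: the quadrant order here is an assumption for illustration only. */
void jp4_reorder_tile(unsigned char in[16][16], unsigned char out[16][16])
{
    for (int r = 0; r < 16; r++)
        for (int c = 0; c < 16; c++)
            /* phase (r&1, c&1) picks the 8x8 quadrant;
             * (r/2, c/2) is the position inside that quadrant */
            out[(r & 1) * 8 + (r / 2)][(c & 1) * 8 + (c / 2)] = in[r][c];
}
```

[The mapping is a pure permutation of pixels, so a decoder simply applies the
inverse shuffle after JPEG decoding to recover the Bayer mosaic layout.]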

Andrey


 On Wed, 15 Apr 2020 00:42:28 -0600 Nick Duvoisin 
 wrote 


Hi,



I recently ordered a sensor board from the Elphel 323 camera off Ebay and had a 
few questions about getting it up and running. The board has the Kodak KAI11000 
CCD sensor and has model number 1A20324 Rev "A".  I'm a software engineer, so 
you can be technical in your responses.



This particular sensor board contains both the FPGA-based timing/interface 
module and the analog sensor front-end, correct? (I can see a Xilinx Spartan 
FPGA on it)


>From a hardware standpoint, all I need to create a functioning camera is a 
>10353 processor board, power supply, and injector cable, correct? (besides a 
>housing, lens mount, and lens)


Do you still sell the 10353 processor board, or will I have to have one made 
from the Gerber file and parts list?


For the software, it looks like the x353 repository on GitHub contains the 
Verilog to create the BIT file for the FPGA.  But where can I find the 
Linux-based webserver software that allows me to control the camera over 
Ethernet?  It looks like the linux-elphel repository contains this software for 
the 10393 board only.



My ultimate goal is to create a camera based on a monochrome full-frame CCD 
sensor that does not apply a demosaicing  algorithm to the raw pixel data.  
Much like a Leica M Monochrom, but with a Canon EF lens mount.  I realize this 
will take a long time to complete, but I'm looking forward to the challenge and 
learning more about how image sensors work!



Thanks,

Nick



___
Support-list mailing list 
mailto:Support-list@support.elphel.com 
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


[Elphel-support] Questions about Elphel 323 Sensor Board

2020-04-15 Thread Nick Duvoisin
Hi,

I recently ordered a sensor board from the Elphel 323 camera on eBay and
have a few questions about getting it up and running. The board has the
Kodak KAI-11000 CCD sensor and model number 1A20324 Rev "A".  I'm a
software engineer, so you can be technical in your responses.


   1. This particular sensor board contains both the FPGA-based
   timing/interface module and the analog sensor front-end, correct? (I can
   see a Xilinx Spartan FPGA on it)

   2. From a hardware standpoint, all I need to create a functioning camera
   is a 10353 processor board, power supply, and injector cable, correct?
   (besides a housing, lens mount, and lens)

   3. Do you still sell the 10353 processor board, or will I have to have
   one made from the Gerber file and parts list?

   4. For the software, it looks like the x353 repository on GitHub
   contains the Verilog to create the BIT file for the FPGA.  But where can I
   find the Linux-based webserver software that allows me to control the
   camera over Ethernet?  It looks like the linux-elphel repository contains
   this software for the 10393 board only.


My ultimate goal is to create a camera based on a monochrome full-frame CCD
sensor that does not apply a demosaicing algorithm to the raw pixel data.
Much like a Leica M Monochrom, but with a Canon EF lens mount.  I realize
this will take a long time to complete, but I'm looking forward to the
challenge and learning more about how image sensors work!

Thanks,
Nick


Re: [Elphel-support] questions

2014-04-28 Thread Alexandre Poltorak
Hi,

The only relationship between exposure time and frame rate is that the latter
can be limited by the former. In triggered mode you have to set the exposure
time carefully (or set a limit on the maximal exposure time in autoexposure);
in free-running mode the exposure automatically limits the frame rate (again,
you can set a maximal exposure time).

So the maximal exposure time and resolution to get 25 FPS should be 40 ms and
1920x1088.
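[A trivial sanity check of the numbers above - an editor's sketch, not camera
code: in free-running mode the frame period cannot be shorter than the
exposure, so the longest exposure compatible with a target frame rate is
simply the frame period. Readout overhead is ignored in this simplification.]

```c
#include <assert.h>

/* Longest exposure (in milliseconds) that still allows `fps` frames per
 * second in free-running mode: the exposure may not exceed the frame
 * period. Sensor readout/overhead time is ignored here. */
double max_exposure_ms(double fps)
{
    return 1000.0 / fps; /* frame period in milliseconds */
}
```

[For 25 FPS this gives 1000/25 = 40 ms, matching the figure above.]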

I have tried many different cameras, and the Elphel NC353L does not perform
badly in dark environments.
You can increase the exposure to 40 ms.
Check that your diaphragm is fully open.

But if you really want to use the camera in low-light situations, I suggest
using a monochrome sensor, removing the IR-cut filter, and using an
IR-compatible lens.

Best regards,
Alexandre Poltorak
CTO
 
Office: +41 22 341 3300
Mobile: +41 79 696 4225
Email: a.polto...@foxel.ch
Web: www.foxel.ch

FOXEL SA
Chemin de Champ-Claude 10
1214 Vernier, Geneva, Switzerland



Re: [Elphel-support] questions

2014-04-27 Thread 201图像所
Hi,

First of all, many thanks for your timely reply.

I set the frame rate (25 fps) in free-run mode and checked it in the
camera control interface yesterday.

Today I set the frame rate (25 fps) with TRIG=4, TRIG_CONDITION=0,
TRIG_PERIOD=384, checked it in the camera control interface (the frame
rate is 25 fps, shown in green), and also checked it the way you
suggested; it is 25 fps there too.

However, when I set the exposure time as low as 10 ms, the video is
too dark to see in natural light. Can the Elphel camera only reach 25 fps
with such dark pictures, or is there anything else I must adjust?
What is the relationship between frame rate and exposure time in the
Elphel camera?

We hope to hear from you soon. Many thanks for your support.

Rui Song



Re: [Elphel-support] questions

2014-04-25 Thread 201图像所
Dear Madam/Sir,

I am glad to receive your timely reply.

I also want to ask a question about frame rate. I learned from your
website that the frame rate can reach at least 15 fps at full resolution,
and up to 25 fps when the resolution is lowered. But I see a frame rate of
2 fps at full resolution, and when I lower the resolution or raise the
frame-rate parameter manually, the frame rate does not actually change
(the camera video is not affected at all). Please tell me the reason.

We hope to hear from you soon. Many thanks for your support.

2014-04-25 1:08 GMT+08:00 Oleg support-list@support.elphel.com:
 Hello,

 Ogg Theora was implemented on the previous camera (model 333) and was never
 ported to the current one. In the 353 camera we use just Motion JPEG (a
 sequence of JPEG frames), either standard JPEG or the modified version, JP4,
 which is designed to compress raw Bayer pixels without conversion to YCbCr
 4:2:0.

 Best regards,
 Oleg Dzhimiev
 Electronics Engineer
 phone: +1 801 783  x124
 Elphel, Inc.



Re: [Elphel-support] questions

2014-04-25 Thread Alexandre Poltorak
A long exposure time? ;)

Alexandre Poltorak
CTO
 
Office: +41 22 341 3300
Mobile: +41 79 696 4225
Email: a.polto...@foxel.ch
Web: www.foxel.ch

FOXEL SA
Chemin de Champ-Claude 10
1214 Vernier, Geneva, Switzerland



Re: [Elphel-support] questions

2014-04-25 Thread Oleg
Hello,



How do you set the frame rate - free run (default), or the TRIG,
TRIG_CONDITION and TRIG_PERIOD parameters?

How do you check the frame rate? In the camera control interface?
Sometimes it can be incorrect. If the number is green, the exposure is
short enough for the maximum fps; if red, the exposure limits the frame
rate.
What's the exposure time? Set it to 10 ms, for example.

A better way to check the fps:
http://192.168.0.9/parsedit.php?images=4:3:.1

Best regards,
Oleg Dzhimiev
Electronics Engineer
phone: +1 801 783  x124
Elphel, Inc.


Re: [Elphel-support] questions

2014-04-24 Thread 201图像所
Dear Mrs/Mr,

Now there is a problem that bothers me a lot. Your website says that the
codec is a mix of Theora and MJPEG; however, I only see JPEG. I want to
make sure which one is right.

We hope to hear from you soon. Many thanks for your support.

Rui Song



Re: [Elphel-support] questions

2014-03-14 Thread 201图像所
Dear Madam/Sir,

First of all, thank you for your timely reply.

I really appreciate the technology of your camera and would like to study
its code. I have downloaded the SVN code (elphel_8.0), but after a few
weeks I am still making slow progress on the structure of the camera. Do
you have any documents about the structure of the camera? I would really
appreciate it if you could send me some. Many thanks.

Rui Song

2014-03-08 0:58 GMT+08:00 Olga Filippova support-list@support.elphel.com:
 Dear Rui,
 Thank you for your purchase of the Elphel NC353L camera.
 As I can see on the Aptina web site, the data sheet is closed right now (it
 was open before), but if you search for the MT9P031 sensor you can find the
 datasheet. It is exactly the same sensor - they changed the name for
 marketing purposes. We used MT9P031 as well as MT9P006, and there is no
 difference.
 Best regards,






[Elphel-support] questions

2014-03-06 Thread 201图像所
Dear Madam/Sir,

It's my pleasure to write to you.

I am a customer who bought an NC353L camera at the beginning of this year.
I want to study the algorithms of the camera, but there are some problems
bothering me. I can't download the datasheet of the sensor (MT9P006); I
think you have it - would you please send me a copy?
Here is some information to certify my identity as a buyer:
  Date: 01/27/2014
  Invoice No: NC353-940
  Item description: NC353L-369-5C, S/N 00:0E:64:08:1D:0B

We hope to hear from you soon. Many thanks for your support.

Rui Song





[Elphel-support] questions from elphel_dng.c

2012-10-14 Thread Venkat Subbiah
In elphel_dng.c I see a reference to a gamma value; what is the purpose
of this parameter?
  if ((gam = atof(argv[1])) <= 0) {
    fprintf (stderr, "Gamma must be positive!\n");
    return 1;
  }

--

First I want to figure out how I can take this data and write it out:
  for (row=0; row < cinfo.image_height; row += 16) {
    for (r=0; r < 16; )
      r += jpeg_read_scanlines (&cinfo, buf+r, 16-r);
    for (r=0; r < 16; r++) {
      for (col=0; col < cinfo.image_width; col += 16)
        for (c=0; c < 16; c++)
          out[col+c] = buf[ROT(r)][col+ROT(c)];
      TIFFWriteScanline (tif, out, row+r, 0);
    }
  }


I am new to the jpeg API, but In this loop looks like it takes the data
from the JP4 file and writes it out as a tiff file.

I would like to figure out how I could write this out as just a sequence of
bytes (or 16-bit words)
corresponding to (width+4)*(height+4), where width and height are the image
dimensions, rather than converting it to a TIFF file.
Would this data be exactly the same as the data from the raw camera file?
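The per-macroblock rearrangement that loop performs can be sketched in Python. This is a toy illustration only; the ROT() index map used below, ((x << 1) & 14) | ((x >> 3) & 1), is an assumption about the macro in elphel_dng.c, not a verified copy of it:

```python
def rot(x):
    # Assumed ROT() mapping: source rows/cols 0-7 (one Bayer phase per
    # 8x8 block) interleave to even output positions, 8-15 to odd ones.
    return ((x << 1) & 14) | ((x >> 3) & 1)

def jp4_deblock_macroblock(buf):
    """Rearrange one 16x16 JP4 macroblock (16 rows of 16 pixels)
    back into Bayer-mosaic pixel order."""
    out = [[0] * 16 for _ in range(16)]
    for r in range(16):
        for c in range(16):
            out[r][c] = buf[rot(r)][rot(c)]
    return out
```

Applied per 16x16 block across the whole image, this is the out[col+c] = buf[ROT(r)][col+ROT(c)] step with the TIFF output stripped away.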
___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


Re: [Elphel-support] questions from elphel_dng.c

2012-10-14 Thread support-list
Venkat,

I'm not the author of that program, but yes - it seems to convert the file and 
save it as TIFF. I'm not sure whether it processes the metadata needed for a 
correct conversion (our ImageJ plugin un-applies all processing between the raw 
pixels and the camera output, e.g. the sensor's internal analog gain for each channel).

Andrey

 On Sun, 14 Oct 2012 00:20:24 -0700 Venkat Subbiah ven...@magnasinc.com 
wrote:

  In the elphel_dng.c file I see a reference to a gamma parameter - what is 
  the purpose of this parameter? 
    if ((gam = atof(argv[1])) <= 0) {
      fprintf (stderr, "Gamma must be positive!\n");
      return 1;
    }
  
  --
  
  First, I want to figure out how I can take this data and write it out: 
    for (row=0; row < cinfo.image_height; row += 16) {
      for (r=0; r < 16; )
        r += jpeg_read_scanlines (&cinfo, buf+r, 16-r);
      for (r=0; r < 16; r++) {
        for (col=0; col < cinfo.image_width; col += 16)
          for (c=0; c < 16; c++)
            out[col+c] = buf[ROT(r)][col+ROT(c)];
        TIFFWriteScanline (tif, out, row+r, 0);
      }
    }
  
  I am new to the JPEG API, but it looks like this loop takes the data from 
  the JP4 file and writes it out as a TIFF file.
  
  I would like to figure out how I could write this out as just a sequence of 
  bytes (or 16-bit words) 
  corresponding to (width+4)*(height+4), where width and height are the image 
  dimensions, rather than converting it to a TIFF file.
  Would this data be exactly the same as the data from the raw camera file?
   ___ 
  Support-list mailing list 
  Support-list@support.elphel.com 
  http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com 
  


___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


Re: [Elphel-support] Questions about JP4

2011-11-30 Thread Florent Thiery
Hi,


- (how) can i render a JP4 image to RGB using ImageJ plugins ? I can
successfully import it (but it is still gray with squares)

 Converting from Bayer mosaic is not a trivial task, there is no single
 best conversion - there are many different de-mosaic or de-bayer
 algorithms, each with different features. Some may work better for nature
 photos, some - for photos with multiple straight and parallel lines (such
 as text, or photos of man-made structures, like buildings or fences). I was
 able to find some ImageJ de-bayer algorithms, but they were rather basic.


Okay, no worries.


- is there a C implementation somewhere similar to our JP46 to bayer code
 (http://code.google.com/p/gst-plugins-elphel/source/browse/trunk/jp462bayer/src/gstjp462bayer.c#856)
 so that we can port it? If not, where can we have more information regarding
 the difference between JP46 and JP4, and the corresponding decoding
 adjustments needed?

 Yes, there is a C implementation of the JP4 conversion, but I would
 recommend you to keep an eye on the java implementation in ImageJ plugin.


Alright, so to sum up: the ImageJ JP4/JP46 importer plugin does the JP4* to
Bayer conversion. We will take a look. The reason we need a C
implementation is for inclusion in a GStreamer plugin.


 Once in a while we add more parameters that are passed from the camera via
 Exif MakerNote  field, and the ImageJ implementation is always maintained
 to match the latest camera firmware.


Does the JP4 to Bayer step require parsing these parameters (which AFAIK is
not the case with JP46)?


 You may find C code here: http://wiki.elphel.com/index.php?title=Movie2dng
 (or just search for movie2dng), but it may be a little behind the latest
 MakerNote.


Thanks

Florent
___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


Re: [Elphel-support] Questions about JP4

2011-11-30 Thread Florent Thiery
Funny, when reading Movie2DNG's src/jp4.cpp:

  // EXIF
  readMakerNote();

  // JP4 deblocking
  // from
http://code.google.com/p/gst-plugins-elphel/source/browse/trunk/jp462bayer/src/gstjp462bayer.c

Is the JP4 to Bayer the same as JP46 to Bayer ?
___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


Re: [Elphel-support] Questions about JP4

2011-11-30 Thread Andrey Filippov
On Wed, Nov 30, 2011 at 4:20 AM, Florent Thiery
florent.thi...@ubicast.eu wrote:

 Funny, when reading Movie2DNG's src/jp4.cpp:

   // EXIF
   readMakerNote();

   // JP4 deblocking
   // from
 http://code.google.com/p/gst-plugins-elphel/source/browse/trunk/jp462bayer/src/gstjp462bayer.c

 Is the JP4 to Bayer the same as JP46 to Bayer ?



Florent, JP4 and JP46 are handled very similarly. The important part is
handled in the same way - un-applying the gamma conversion according to the
MakerNote data (the analog gains are also un-applied according to the
MakerNote, but that is less critical). And I believe that after that step,
and before any other processing (de-mosaic, aberration correction, ...), the
data should be kept with more than 8 bits per pixel. Theoretically it is
possible to pass the gamma conversion data on to the DNG (as was already done),
but in that case there are more possibilities for something going wrong
in later conversion. So I would prefer to use linear 16+ bit intermediate
Bayer data (after the JP4/JP46-specific processing).
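The "un-apply gamma, then keep linear 16+ bit data" step can be sketched as follows. This is a hedged illustration: the real inverse transfer curve comes from the per-frame tables in the Exif MakerNote; the pure power-law form and the 0.57 default below are assumptions for demonstration only:

```python
def inverse_gamma_table(encoded_gamma=0.57, out_bits=16):
    """Build a 256-entry lookup mapping 8-bit gamma-encoded pixel
    values back to linear values with out_bits of precision.
    The power-law curve is only a stand-in for the camera's real
    gamma table carried in the Exif MakerNote."""
    max_out = (1 << out_bits) - 1
    table = []
    for v in range(256):
        linear = (v / 255.0) ** (1.0 / encoded_gamma)  # invert the encoding
        table.append(int(round(linear * max_out)))
    return table
```

Decoded JP4/JP46 pixels would be pushed through such a table once, and all later stages (de-mosaic, aberration correction) would then operate on 16-bit linear values.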

Andrey
___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


Re: [Elphel-support] Questions about JP4

2011-11-29 Thread Andrey Filippov
On Tue, Nov 29, 2011 at 10:44 AM, Florent Thiery
florent.thi...@ubicast.eu wrote:

 Hello

- is a (jpeg) compression applied to the images when grabbing a JP4
stream? In other words, does the compression rate parameter have an effect
in JP4 mode?

 Florent, yes, JPEG compression is applied and the quality parameter is
applicable. The quantization is calculated according to the JPEG-suggested
formulas. It is trickier, but still possible, to use arbitrary
quantization tables. When you specify 100% quality there is no
quantization, but the files will be large.


- (how) can i render a JP4 image to RGB using ImageJ plugins ? I can
successfully import it (but it is still gray with squares)

 In ImageJ you are getting just the raw Bayer mosaic data - the same as the
sensor outputs. That is the whole goal of the JP4 format - to preserve as much
as possible of the actual sensor data while providing reasonable
compression (determined by your bandwidth budget). Normal JPEG is designed
primarily for image distribution; the JP4 modification is camera-centric,
providing a way to get data from the sensor to the post-processing computer.

Converting from a Bayer mosaic is not a trivial task; there is no single
best conversion - there are many different de-mosaic (de-Bayer)
algorithms, each with different features. Some may work better for nature
photos, some for photos with multiple straight and parallel lines (such
as text, or photos of man-made structures like buildings or fences). I was
able to find some ImageJ de-Bayer algorithms, but they were rather basic.
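For contrast with the frequency-domain approach described next, the "rather basic" end of the spectrum looks like this - bilinear interpolation of just the green channel of an RGGB mosaic (a toy sketch, not any particular plugin's algorithm; borders and the red/blue planes are skipped):

```python
def bilinear_green(mosaic):
    """Fill in the green channel of an RGGB Bayer mosaic (list of rows)
    by averaging the 4 direct neighbours at red/blue sites.
    Border pixels are left as-is to keep the sketch short."""
    h, w = len(mosaic), len(mosaic[0])
    green = [row[:] for row in mosaic]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if (x + y) % 2 == 0:  # red or blue site in an RGGB layout
                green[y][x] = (mosaic[y - 1][x] + mosaic[y + 1][x]
                               + mosaic[y][x - 1] + mosaic[y][x + 1]) // 4
    return green
```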

When processing Eyesis images, each color component is first processed for
aberration correction separately, as there is always some chromatic
aberration in the images. At the next stage the colors are combined with the
frequency-domain algorithm we call spectral scissors - aliases are
removed according to the nature of the image, and the shape of the
separating cuts for the red and blue components (they have lower resolution
than green and so closer/more-overlapping aliases) is calculated using the
information from the green channel. This implementation is designed to work
on a subdivided pixel grid - during the aberration correction we convert each
color component to 20MPix images (2x in each direction) and reduce the
resolution back to the original only at the very last stage of the processing.

You may find an illustration of this de-mosaic process in the blog post
http://blog.elphel.com/2010/11/zoom-in-now-enhance/ (around the last 1/3 of
it); the image itself is here:
http://blogs.elphel.com/wp-content/uploads/2010/11/scissors.png  It is
possible to run this algorithm on JP4 images without aberration correction
with
http://elphel.git.sourceforge.net/git/gitweb.cgi?p=elphel/ImageJ-Elphel;a=blob
(the correction requires kernels that are calculated with
Aberration_Calibration.java).





- is there a C implementation somewhere similar to our JP46 to bayer code
 (http://code.google.com/p/gst-plugins-elphel/source/browse/trunk/jp462bayer/src/gstjp462bayer.c#856)
 so that we can port it? If not, where can we have more information regarding
 the difference between JP46 and JP4, and the corresponding decoding
 adjustments needed?


Yes, there is a C implementation of the JP4 conversion, but I would
recommend you keep an eye on the Java implementation in the ImageJ plugin.
Once in a while we add more parameters that are passed from the camera via
the Exif MakerNote field, and the ImageJ implementation is always maintained
to match the latest camera firmware. It is still compatible with the
older firmware releases - so far we have been appending new data to the end
of the MakerNote, so the beginning of it stays the same.

You may find C code here:
http://wiki.elphel.com/index.php?title=Movie2dng (or just search for
movie2dng), but it may be a little behind the latest
MakerNote.

Andrey





___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


Re: [Elphel-support] Questions about the sensor in Elphel353

2011-10-26 Thread Diana Carrigan
Hello,

Do all of the JP4 modes listed in the article below (under Different JP4 Modes 
in 8.x Software) bypass the demosaic in the FPGA? If not, which do and which 
don't?

Best Regards,

Diana Carrigan

From: Alexandre Poltorak [mailto:po...@alsenet.com]
Sent: Tuesday, September 06, 2011 12:55 PM
To: Diana Carrigan
Subject: Re: [Elphel-support] Questions about the sensor in Elphel353

Hi,

Please take a look here : http://wiki.elphel.com/index.php?title=JP4

JP4 is Elphel's RAW format. Disk Recorder can record even long videos in JP4.

Conversion must be done on a computer in post-processing.

Best regards,
Alexandre

De: Diana Carrigan dianaejohn...@maxim-ic.com
À: support-list@support.elphel.com
Envoyé: Mardi 6 Septembre 2011 20:49:13
Objet: [Elphel-support] Questions about the sensor in Elphel353


Hello,

We have a couple of questions I’m sure you guys can answer.


1)  What is the sensor data format defaulted to? (RGB? YUV?)

2)  Does the sensor do an RGB to YUV conversion?

3)  Can the Disk Recorder (or some other program) be used to create a data 
file of the video stream of the RAW RGB data (perhaps 5 seconds worth of video 
stream)?
Thanks for your help..

Diana Carrigan

___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com

___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


[Elphel-support] Questions about the sensor in Elphel353

2011-10-26 Thread Andrey Filippov
Diana,

Yes - all of them. I would recommend using just JP4 (color mode 5) in the
camera drop-down menu. You may also use JP46 (color mode 4) - it is the
easiest to open with programs as if it were a regular JPEG (i.e. in the
browser). There will be distortions, but the image as a whole will look like
the original. JP4 has the advantage that you can run full-resolution 5MPix
images at the full sensor rate, while JP46 will limit the frame rate to 10.6 fps.

It is caused by the dummy color components in JP46 (added for compatibility
with color JPEG), and the FPGA compressor code requires 2 cycles per
compressed sample (the clock is 160MHz). In JP4 mode each source pixel
produces 1 compressed sample; in JP46 (as in color YCbCr 4:2:0) - 1.5 samples
(for each 16x16 macroblock it compresses 4 Y 8x8 blocks, one Cb 8x8 block
and one Cr 8x8 block).

So the compressor pixel rate (referenced to sensor pixels) will be
160MHz/2 = 80MPix/sec for JP4 and 160MHz/3 ~= 53MPix/sec for JPEG and JP46.
For 2592x1936 images that results in 15.94 fps for JP4 and 10.63 fps for
JPEG/JP46. 15.94 fps is above the sensor capabilities, so in that case the
frame rate is limited by the sensor (if not by the network/HDD, of course, but
that depends on the nature of the image, compression quality and coring
settings).
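The arithmetic above can be checked in a few lines, using the figures stated in this thread (a 160 MHz compressor clock, 2 cycles per compressed sample, and 1 vs. 1.5 samples per sensor pixel):

```python
def compressor_fps(width, height, cycles_per_pixel, clock_hz=160e6):
    """Frame-rate ceiling imposed by the FPGA compressor: clock rate
    divided by cycles spent per sensor pixel, divided by frame size."""
    return clock_hz / cycles_per_pixel / (width * height)

# JP4: 1 compressed sample per pixel x 2 cycles = 2 cycles/pixel.
jp4_fps = compressor_fps(2592, 1936, cycles_per_pixel=2)   # ~15.94 fps
# JPEG/JP46: 1.5 samples per pixel x 2 cycles = 3 cycles/pixel.
jp46_fps = compressor_fps(2592, 1936, cycles_per_pixel=3)  # ~10.63 fps
```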

Both JP4 and JP46 are supported by the ImageJ plugin (
http://elphel.git.sourceforge.net/git/gitweb.cgi?p=elphel/ImageJ-Elphel;a=blob;f=JP46_Reader_camera.java).
The plugin also reads the custom MakerNote Exif data that contains the
in-camera gamma conversion settings as well as the analog gains, so the
resulting opened image (as 32-bit floating point) has pixel values proportional
to the raw sensor ones. The plugin can read saved files or get images directly
from the camera. Like other Elphel software, it is released under the GNU GPLv3
license, so you can use the available source code to write your own program
(e.g. in a different programming language).

Andrey




On Wed, Oct 26, 2011 at 9:40 AM, Diana Carrigan
dianaejohn...@maxim-ic.com wrote:

 Hello,

 Do all of the JP4 modes listed in the article below (under Different JP4
 Modes in 8.x Software) bypass the demosaic in the FPGA? If not, which do and
 which don't?

 Best Regards,

 Diana Carrigan

 From: Alexandre Poltorak [mailto:po...@alsenet.com]
 Sent: Tuesday, September 06, 2011 12:55 PM
 To: Diana Carrigan
 Subject: Re: [Elphel-support] Questions about the sensor in Elphel353

 Hi,

 Please take a look here: http://wiki.elphel.com/index.php?title=JP4

 JP4 is Elphel's RAW format. Disk Recorder can record even long videos in
 JP4.

 Conversion must be done on a computer in post-processing.

 Best regards,

 Alexandre
 --

 De: Diana Carrigan dianaejohn...@maxim-ic.com
 À: support-list@support.elphel.com
 Envoyé: Mardi 6 Septembre 2011 20:49:13
 Objet: [Elphel-support] Questions about the sensor in Elphel353

 Hello,

 We have a couple of questions I'm sure you guys can answer.

 1)  What is the sensor data format defaulted to? (RGB? YUV?)

 2)  Does the sensor do an RGB to YUV conversion?

 3)  Can the Disk Recorder (or some other program) be used to create a
 data file of the video stream of the RAW RGB data (perhaps 5 seconds worth
 of video stream)?

 Thanks for your help..

 Diana Carrigan

 ___
 Support-list mailing list
 Support-list@support.elphel.com
 http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com

 ___
 Support-list mailing list
 Support-list@support.elphel.com
 http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


Re: [Elphel-support] Questions about the sensor in Elphel353

2011-09-06 Thread Andrey Filippov
On Tue, Sep 6, 2011 at 12:49 PM, Diana Carrigan
dianaejohn...@maxim-ic.com wrote:

 Hello,

 We have a couple of questions I'm sure you guys can answer.

 1)  What is the sensor data format defaulted to? (RGB? YUV?)

 2)  Does the sensor do an RGB to YUV conversion?

Diana,

The sensor data format (if you really mean the 'sensor', not the camera) is
Bayer-encoded 12-bit data - you may find more details in the sensor
datasheet on the Aptina web site (MT9P031). But if you mean the FPGA/camera,
then there are several image formats available. The default is regular JPEG
with YCbCr 4:2:0; in that case the FPGA performs a simple (bi-linear) demosaic
combined with the RGB-to-YUV conversion.

For higher quality we use the JP4 format (explained here -
http://community.elphel.com/jp4/jp4demo.php ); with the compression set to
100% it preserves virtually all the data available from the sensor while
making it possible to fit the sensor output into the transmission/recording
bandwidth of the camera.


 

 3)  Can the Disk Recorder (or some other program) be used to
 create a data file of the video stream of the RAW RGB data (perhaps 5
 seconds worth of video stream)?

We have the possibility to acquire raw frames from the camera, but that is only
for testing purposes, to make sure the compressed output matches the sensor
data. When compressing the raw data in JP4 format we keep the compression
errors below the sensor noise floor, so using the actual raw data does not
provide you with more information. And it is not possible to record full
resolution/frame rate data in the camera in raw format - it would require
higher bandwidth with no image improvement.
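A back-of-envelope calculation illustrates the bandwidth point. The figures are illustrative assumptions only (12-bit Bayer data near the sensor's full rate against a nominal 100 Mbit/s Ethernet link):

```python
# Uncompressed sensor bandwidth vs. a nominal 100 Mbit/s network link.
width, height, fps, bits_per_pixel = 2592, 1936, 15, 12
raw_bps = width * height * fps * bits_per_pixel  # ~903 Mbit/s uncompressed
link_bps = 100e6                                 # assumed 100BASE-T link rate
ratio = raw_bps / link_bps                       # roughly 9x over capacity
```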

JP4 requires post-processing; you may use the ImageJ program with our plugin
(
http://elphel.git.sourceforge.net/git/gitweb.cgi?p=elphel/ImageJ-Elphel;a=blob;f=JP46_Reader_camera.java)
that reads the JP4 images (you can split a JP4-based video file into
frames -
http://elphel.git.sourceforge.net/git/gitweb.cgi?p=elphel/miscellaneous;a=blob_plain;f=split_mov.php)
together with the Exif data (including the custom MakerNote that carries
additional camera settings information) and converts them to raw Bayer
mosaic data, un-applying the in-camera conversion (such as the analog
per-channel gains and gamma correction) and resulting in floating-point TIFF
images that can be processed by different programs.

We also have programs to convert JP4 movies into regular ones.

Andrey



 

 Thanks for your help..

 Diana Carrigan


___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


Re: [Elphel-support] Questions regarding zoom in ... now enhance blog post

2011-04-13 Thread Andrey Filippov
On Tue, Apr 12, 2011 at 7:28 AM, Florent Thiery
florent.thi...@ubicast.eu wrote:

 Hello,

 First, please let me introduce our use case; we are trying to use Elphel
 cameras (353 + Computar 4-8mm 1/2) to get maximum resolution ( FullHD), 25
 fps video.

 Our current problems are about image quality when zooming on details
 (blurred images); in other terms, we are trying to improve the rendering
 quality as much as possible to have cleaner images. In this context, last
 year we implemented http://code.google.com/p/gst-plugins-elphel/ but only
 changing the debayering algorithm did not improve the quality enough for our
 application (at least not when compared to the processing overhead).

 I was wondering about the method described in the awesome article Zoom in
 ... now enhance:

- are there any specifics about the method being for Eyesis only (my
guess is it's not) ?

 Florent, sure, it can be used with a single camera too. But it may be too
slow for processing video - it currently takes 2-3 minutes per frame on an i7
with 8 GB of RAM (in multi-threaded mode).


- regarding the calibration, what are the invariable factors ? Is the
calibration required for:
   - every camera model/generation (depending on camera/sensor
   manufacturing design/process variations) ?
   - every lens model (depending on lens model) ?
   - every lens tuning (zoom level / focus / iris ...) ?
   - climatic condition changes (temperature, ...) ?

 This was designed with everything fixed (though we did not notice
degradation with changing temperatures). But there is a significant
difference between lenses, even of the same model. The lenses we used do not
have zoom, and an iris does not make much sense for such lenses - with 2.2 um
pixels you cannot stop the lens down beyond ~f/4-f/5.6 because of diffraction,
small sensors do not provide much control over DoF, and using the iris to
limit light is not really needed with ERS sensors - they handle very short
exposures perfectly. And of course, the focus setting would influence the
results too, as well as the lens orientation.


- the hidden question behind this is: how can this technique be used in
production ?


Working with wide-angle lenses (>45x60 degrees FOV) we do not have enough
room to capture the test pattern in a single shot, so the software is able to
combine multiple shots where the test pattern covers just a fraction of the
frame. We did not work on optimizing the computational time of the calibration,
so it takes several hours to process data from 8 camera modules. The current
calibration is only designed for aberration correction, but I'm now
working on precise distortion calibration (with some 1/10 pixel
precision); we plan to use it for panorama stitching, and it can also be used
for making measurements with the camera. This calibration will use the same
pattern we use for aberration correction, just at closer range, so the
pattern will cover the whole FOV (slight defocus is not a problem here).


- For a given camera/lens combination, could a public database of
   tuning data reduce the calibration requirement (in a similar fashion to
   A-GPSes which download correction data from the network to increase
   performance on low-quality reception and/or chips
   http://en.wikipedia.org/wiki/Assisted_GPS) ?

 Our goal was to have precise individual-lens correction; we did not
experiment with correcting for a lens model - it is probably possible, but
with less correction, of course. The software has multiple tuning parameters,
so it should be possible to do that.


- is there a hope of having such a feature (in the long term)
   integrated in the camera itself (i.e. grabbing an mjpeg stream who had 
 the
   corrections made right before the encoding) ?


Not in the near future, at least. We now rely heavily on post-processing;
the camera's role is just to capture everything the sensor can provide, in as
raw a form as possible.

Andrey


 Thanks

 Florent

 ___
 Support-list mailing list
 Support-list@support.elphel.com
 http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


[Elphel-support] Questions regarding zoom in ... now enhance blog post

2011-04-12 Thread Florent Thiery
Hello,

First, please let me introduce our use case; we are trying to use Elphel
cameras (353 + Computar 4-8mm 1/2) to get maximum resolution (> FullHD), 25
fps video.

Our current problems are about image quality when zooming in on details
(blurred images); in other words, we are trying to improve the rendering
quality as much as possible to get cleaner images. In this context, last
year we implemented http://code.google.com/p/gst-plugins-elphel/ but only
changing the debayering algorithm did not improve the quality enough for our
application (at least not when compared to the processing overhead).

I was wondering about the method described in the awesome article Zoom in
... now enhance:

   - are there any specifics about the method being for Eyesis only (my
   guess is it's not) ?
   - regarding the calibration, what are the invariable factors ? Is the
   calibration required for:
  - every camera model/generation (depending on camera/sensor
  manufacturing design/process variations) ?
  - every lens model (depending on lens model) ?
  - every lens tuning (zoom level / focus / iris ...) ?
  - climatic condition changes (temperature, ...) ?
   - the hidden question behind this is: how can this technique be used in
   production ?
  - For a given camera/lens combination, could a public database of
  tuning data reduce the calibration requirement (in a similar fashion to
  A-GPSes which download correction data from the network to increase
  performance on low-quality reception and/or chips
  http://en.wikipedia.org/wiki/Assisted_GPS) ?
  - is there a hope of having such a feature (in the long term)
  integrated in the camera itself (i.e. grabbing an mjpeg stream
who had the
  corrections made right before the encoding) ?


Thanks

Florent
___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


Re: [Elphel-support] Questions about 10359 and wavelet transformatoin

2011-01-10 Thread Jens Bürger
 Hi Jens,
 
 - Where are the registers located, which are accessible via the
 reg_read-command?
 It's I2C bus.
 0x48xx (0x90xx in simulation) - sensor - check datasheet.
 0x08xx (0x10xx in simulation) - 10359 board - for read 8-bit addresses
have
 a look in the end of x359.v, for write - addresses described as
parameters
 in the same file.

So with the lower 7 bits of the I2C address I can control what data is
assigned to i2c_do, right? i2c_do has 32 bits and the possibilities are the
wires assigned in the preceding lines, for example

wire [31:0] r_02 = {16'b0,ddr_do[15:0]};

So data is read from the SDRAM via channel 5 and in 2-byte blocks?

I didn't really find where one can choose the address to be read from, and
what do you mean by 8-bit addresses? I see that there are only 8 bits left in
the I2C register.

How can I access the memory? Are there hints on what the memory layout is
like, i.e. how mcontr abstracts the SDRAM?

 no buffering - the direct channels are enabled/disabled in turns - not
 synchronized.

What does not synchronized mean?

Thanks in advance,
Jens

___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


Re: [Elphel-support] Questions about CCD Horizontal CLK PCB routing in 10342

2010-12-13 Thread Andrey Filippov
Shawyoo,

general notes about CCD board PCB design are probably too broad a topic for
me to help you with. I do not have such notes ready; I can provide
some comments on my (now rather old) 10342 design, but I need to be
sure it will be used in a design with a compatible license. If your
project is an open design, you may post a link to your circuit diagrams; I
can look at them and may be able to give you some advice.

Andrey
___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


Re: [Elphel-support] Questions about 10359 and wavelet transformatoin

2010-12-13 Thread Oleg K Dzhimiev
Hi Jens,

- Where are the registers located, which are accessible via the
 reg_read-command?

It's I2C bus.
0x48xx (0x90xx in simulation) - sensor - check datasheet.
0x08xx (0x10xx in simulation) - 10359 board - for read 8-bit addresses have
a look in the end of x359.v, for write - addresses described as parameters
in the same file.
If your 10359 is revision A (should be):
0x50xx - EEPROM on the 10359a.
0x69xx - clock generator on the 10359a.


 - How could we access 1359's DDR-RAM via read/write from the Linux-system?
  We thought about implementing control operations via the webinterface, and
 perhaps
  reading whole images saved / buffered in the DDR

 The possible solution is to transfer the data from the 10359's DDR SDRAM as
frames with sync signals and then read it from the 10353's SDRAM.


 - As we do not need more than one image sensor, we thought about using one
 of the possible alternation modes to do the memory accessing for us somehow
 automatically, so that we connect a wavelet-transformation module via the
 PXD lines of one of the other sensor ports

The 1st frame normal, the second a transformed version of the 1st? Sounds
ok.


 - What does the combine into one frame option do? Does it somehow add
 the last retrieved images of all (connected) sensors?

Combine into one takes the images from 2 sensors and glues them vertically
into one: http://blogs.elphel.com/2009/12/10359-stereo-combining-modes/


 - What is the difference between the pure alternation mode and the one
 with buffering?

With buffering means that the images from the sensors are taken at the same
moment in time, and while the first one is being transferred to the system
board the second one is buffered, then transferred after the 1st frame is
finished.
No buffering means the direct channels are enabled/disabled in turns - not
synchronized.

If you have other questions please ask.

Best regards,
Oleg
___
Support-list mailing list
Support-list@support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com


Re: [Elphel-support] questions about 353 camera

2010-09-10 Thread Schuster, Sebastian
Hi Sebastian,
thank you for your answer.

Low latency is also important for us, thanks for the tip.

I have noticed that my camera has a very low frame rate, ~3 fps in the 
default configuration (using the Camera Control Interface video image and VLC). 
Decreasing the resolution did not increase the frame rate very much, maybe 
just to about 5 fps.

I also tried different kinds of connections, a point to point connection via 
cross over cable and one using a switch. Both with the same slow result.

What can be done to increase the frame rate?

Regards,
Sebastian





-Ursprüngliche Nachricht-
Von: Sebastian Pichelhofer [mailto:sebastian.pichelho...@gmail.com]
Gesendet: Donnerstag, 9. September 2010 18:41
An: Schuster, Sebastian
Cc: support-list@support.elphel.com
Betreff: Re: [Elphel-support] questions about 353 camera


Ah and there is also the option to change the default (boot-time)
parameters using the Parameter Editor.

Regards Sebastian

On Thu, Sep 9, 2010 at 18:29, Sebastian Pichelhofer
sebastian.pichelho...@gmail.com wrote:
 On Thu, Sep 9, 2010 at 17:13, Schuster, Sebastian
 sebastian.schus...@kmweg.de wrote:
 Hello,

 I have some questions about my 353 camera:

 My first question is about the Camera Control Interface. When I change the
 settings for the resolution, can I save them on the camera, or do I have to
 change them every time the camera is powered up?

 They are reset to default values at boot time.

 You can add a startup script that sets the desired parameters at boot
 time, though.


 Second one:
 Using the command line, e.g. vlc rtsp://192.168.0.9:554 -V x11, is there a
 list of commands I can use? Are these commands related to the VLC player or
 to the camera, or both?

 Not related to the camera. Best check the VLC manual or website for a
 list of possible commands.

 One that is very important for low latency streaming from Elphel
 cameras is --rtsp-caching (integer)

Sets the default caching (buffering) value for RTSP streams, in
 milliseconds.

 If I remember correctly the default value is around 5000 (5 seconds).
 Depending on the resolution/framerate/compression settings of the camera
 you can go down to as low as 10-20, which basically gives real-time
 video (lower latency than what we achieved with any other player such as
 mplayer or gstreamer)

 Regards Sebastian


 Regards,
 Sebastian


 Krauss-Maffei Wegmann GmbH  Co. KG
 Sitz der Gesellschaft ist München
 Registergericht: Amtsgericht München, HRA 72 460

 Persönlich haftende Gesellschafterin: Krauss-Maffei Wegmann Verwaltungs GmbH
 Sitz der Gesellschaft ist München
 Registergericht: Amtsgericht München, HRB 118952
 Geschäftsführer: Dipl.-Ing. Frank Haun (Vorsitzender), Dipl.-Kfm. Stefan
 Krischik, Dipl.-Ing. Jürgen Weber
 Vorsitzender des Aufsichtsrates: Johannes Schmidt


 ___
 Support-list mailing list
 Support-list@support.elphel.com
 http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com








[Elphel-support] questions about 353 camera

2010-09-09 Thread Schuster, Sebastian
Hello,

I have some questions about my 353 camera:

My first question is about the Camera Control Interface. When I change the 
settings for the resolution, can I save them on the camera or do I have to 
change them every time the camera is powered up?

Second one:
Using the command line, e.g. vlc rtsp://192.168.0.9:554 -V x11, is there a list 
of commands I can use? Are these commands related to the VLC player or to the 
camera or both?


Regards,
Sebastian







Re: [Elphel-support] questions about 353 camera

2010-09-09 Thread Sebastian Pichelhofer
Ah and there is also the option to change the default (boot-time)
parameters using the Parameter Editor.

Regards Sebastian

On Thu, Sep 9, 2010 at 18:29, Sebastian Pichelhofer
sebastian.pichelho...@gmail.com wrote:
 On Thu, Sep 9, 2010 at 17:13, Schuster, Sebastian
 sebastian.schus...@kmweg.de wrote:
 Hello,

 I have some questions about my 353 camera:

 My first question is about the Camera Control Interface. When I change the
 settings for the resolution, can I save them on the camera or do I have to
 change them every time the camera is powered up?

 They are reset to default values at boot time.

 You can add a startup script that sets the desired parameters at boot
 time, though.


 Second one:
 Using the command line, e.g. vlc rtsp://192.168.0.9:554 -V x11, is there a
 list of commands I can use? Are these commands related to the VLC player or
 to the camera or both?

 Not related to the camera. Best check the VLC manual or website for a
 list of possible commands.

 One that is very important for low latency streaming from Elphel
 cameras is --rtsp-caching (integer)

    Sets the default caching (buffering) value for RTSP streams, in
 milliseconds.

 If I remember correctly the default value is around 5000 (5 seconds).
 Depending on the resolution/framerate/compression settings of the camera
 you can go as low as 10-20, which basically gives real-time video
 (lower latency than what we achieved with any other player such as
 mplayer or gstreamer).

 Regards Sebastian


 Regards,
 Sebastian










[Elphel-support] Questions about FPGA

2010-06-01 Thread Gladys Gladys
Hi all,

I'm studying the H/W part of the Elphel project, and I have some questions.
What I'm trying to do is:

First, I want to set up the interface between the sensor and the FPGA, get
images from the sensor, and send them into the FPGA; then I want to do some
image processing in the FPGA.

I just have a CMOS sensor with a demo board, but I don't have the FPGA. Is
there any way I can test my Verilog code to make sure it serves my
purpose? I am using ModelSim to do some simulation, but I don't think it
works, as I don't have real-time image output from the sensor.

Another question: should the image processing be done after each frame is
stored in memory, or is it done simultaneously as the FPGA receives the
images? Your help will be highly appreciated. Thanks.


Yuhui


Re: [Elphel-support] Questions about FPGA

2010-06-01 Thread Andrey Filippov
Yuhui

Our software distribution available at SourceForge includes the simulation
scripts and the models used. We use Icarus Verilog and GTKWave for
simulation and do not support the ModelSim simulator. Unfortunately, even
simulation requires the Verilog code of Xilinx primitives that we cannot
redistribute because of Xilinx licensing, and some of those primitives
require patches to work properly with Icarus.

So I would recommend making a complete SDK installation as described at
http://wiki.elphel.com/index.php?title=Elphel_Software_Kit_for_Ubuntu (with
minor modifications it also works with (K)Ubuntu 10.04). If you have Xilinx
WebPack installed on your computer, the software build script will detect
the installation, copy the unisims library to the installation directory and
patch it. After that you'll be able to run the x353_sim.sh script that
simulates the design.
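
For orientation, the generic Icarus Verilog flow that a script like
x353_sim.sh wraps looks roughly like this (the file names here are
placeholders, not the actual script contents):

```shell
# Compile the testbench plus design sources into a simulation binary,
# run it, then inspect the resulting waveform dump.
iverilog -o x353_sim.vvp testbench.v   # compile for simulation
vvp x353_sim.vvp                       # run the simulation
gtkwave dump.vcd                       # view waveforms interactively
```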

Image processing in the camera is performed in several steps, all taking
place simultaneously in the FPGA. When the FPGA receives the sensor data, it
can combine it with the FPN data stored in the SDRAM, scale the data,
optionally perform lens vignetting correction, perform gamma conversion, and
save the data in the SDRAM memory attached to the FPGA. Simultaneously,
histograms are calculated.

As soon as more than 20 scan lines are available in the memory, the data is
read back into the FPGA in overlapping 20x20-pixel blocks, optionally
subjected to color interpolation, and then passed through the JPEG
compression stages (DCT, quantization, Huffman encoding, bit stuffer). The
resulting compressed data is transferred to the system memory using DMA.
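
To make the first stage concrete, here is a rough software model of the
per-pixel processing described above: FPN subtraction, linear scaling, and
gamma conversion via a lookup table (as an FPGA would store in block RAM).
The function names, bit depths, and gamma value are illustrative
assumptions, not the actual x353 parameters.

```python
def build_gamma_lut(bits_in=8, bits_out=8, gamma=0.5):
    """Precompute a gamma lookup table, analogous to a BRAM table in the FPGA."""
    max_in = (1 << bits_in) - 1
    max_out = (1 << bits_out) - 1
    return [round(((v / max_in) ** gamma) * max_out) for v in range(max_in + 1)]

def process_pixel(raw, fpn, scale, lut):
    """Model one pixel: FPN-subtract, scale with clipping, gamma-convert."""
    v = max(raw - fpn, 0)                   # fixed-pattern-noise correction
    v = min(int(v * scale), len(lut) - 1)   # scaling, clipped to LUT range
    return lut[v]                           # gamma conversion via table lookup

lut = build_gamma_lut()
print(process_pixel(0, 0, 1.0, lut))    # → 0
print(process_pixel(255, 0, 1.0, lut))  # → 255
```

In the real camera all of this happens per clock in hardware, with the FPN
frame streamed from SDRAM in step with the sensor data rather than looked up
per call.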

Andrey