Re: [linux-dvb] How to gather Signal Quality information from DVB drivers

2007-07-30 Thread Morgan Tørvolt
Hi.

Technically, signal quality should be possible to derive from either BER,
UNC or SNR. Signal level is not really important: a low noise floor
with a weak signal can give you just as good reception as a strong
signal with a high noise floor. What you as an end user want is a
bit-error-free signal to your decoder; the absolute signal level is not
interesting.

In theory, SNR, BER and UNC are directly linked. Given the modulation
type, symbol rate and other carrier parameters, you can use a formula
to convert between any pair of them (BER from SNR, UNC from SNR, UNC
from BER, and so on). Since BER and UNC can both be zero (UNC when
reception is OK, BER when reception is perfect, i.e. no need for FEC or
Reed-Solomon correction), BER can be a good indicator: BER = 0 over a
given timeframe, say 10 seconds, should indicate that the signal is
very good. You will need to use SNR to give a correct result for very
good signals (better signal quality than what is needed to receive the
stream perfectly).

Your real problem is not really the quality part of this equation,
though. It is the fact that all drivers implement this differently. An
SNR reading of 0x2a35 could mean a perfect signal on one driver and no
reception at all on another. Some cards don't give you SNR at all,
possibly even as a hardware constraint. The value will also differ from
card to card (with the same frontend), given that tuners have different
sensitivities. Getting this right is a very difficult task right now,
unless I am totally off here.
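
One pragmatic (if crude) way around those driver differences is per-card calibration: record the raw SNR value once with no signal and once with known-good reception, then scale between the two. A minimal sketch (the register values in the comments are made up for illustration):

```c
#include <stdint.h>

/* Hypothetical per-card normalisation of a raw FE_READ_SNR value.
 * raw_floor: reading observed with no lock; raw_good: reading observed
 * with known-good reception.  Both must be calibrated per card. */
static int snr_percent(uint16_t raw, uint16_t raw_floor, uint16_t raw_good)
{
    if (raw <= raw_floor) return 0;
    if (raw >= raw_good)  return 100;
    return (int)((raw - raw_floor) * 100u / (raw_good - raw_floor));
}
```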

-Morgan-


On 30/07/07, Chun Chung LO [EMAIL PROTECTED] wrote:
 Hi all,

 I wonder how I can gather the DVB-T signal quality information from the
 status value reported by driver (such as BER, Signal strength, S/N, UNC,
 etc).

 As far as I know, I cannot simply use S/N as signal quality, as it is
 only a ratio. But how does one define the signal quality of a DVB-T
 signal? High signal strength? Low BER? Low UNC? Or something else?

 I need to develop a simple callback to let the GUI/application get a
 signal-quality value as a percentage. All I have are some driver
 status readings, but I do not know how to construct a percentage-based
 DVB-T signal quality from them.

 (My hardware is TD1316 + TDA10046 + SAA7134)

 Please help.

 Best regards,
 Lo Chun Chung (Chung)
 [EMAIL PROTECTED] HK.




___
linux-dvb mailing list
linux-dvb@linuxtv.org
http://www.linuxtv.org/cgi-bin/mailman/listinfo/linux-dvb


Re: [linux-dvb] How to gather Signal Quality information from DVB drivers

2007-07-30 Thread Morgan Tørvolt
Hi Wolfgang,

 the SNR value is only dependent on the demodulator you use, so even with
 a different frontend, the values can be directly compared as long as you
 use the same demodulator IC (and firmware, maybe...).
 For most demodulators, you can give a (more or less) simple equation to
 calculate the S/N (or C/N, whatever you want to call it) from the register
 values, but unfortunately there are some exceptions: STV0297 DVB-C for
 example.
You are right, what I said was very wrong. My thinking was correct,
I just didn't write it down properly =)
What I meant to say was that even with the same frontend, two cards
synced on the same signal from the same antenna could report different
SNR as a result of the sensitivity of the tuner, and possibly also the
shielding and the noise coming from the tuner and the signal amplifier
itself.
That said, the SNR is probably the best data you have on the quality.
Unfortunately, devices like the AVerMedia Volar A808 (I have one of
those) do not give you SNR at all, which is very unfortunate. In such a
case the best thing the driver can do is calculate an SNR from the BER,
although that will not be accurate.
From a professional point of view, the actual BER could be the most
important figure, but that too is implemented differently, probably in
hardware as well as in the drivers.

 The signal strength, in turn, is most often not even useful when comparing
 devices of the same type - the tolerances are too big, you can only get
 a rough idea of any signal present or no signal present. (Which is,
 BTW, sometimes very nice to have, too.)
Absolutely!

 Regards,
 Wolfgang

-Morgan-



[linux-dvb] Patch adding extern C to lib/libdvben50221/en50221_stdcam.h

2007-06-27 Thread Morgan Tørvolt

Hi.

This has given me some grey hairs a couple of times. I mentioned it in
#linuxtv earlier, but this time I am sending a patch. Thanks for all the help!

-Morgan-


Morgan.patch
Description: Binary data

Re: [linux-dvb] Problem with TT C1500

2007-05-07 Thread Morgan Tørvolt

I verified that it is the problem by watching the BER numbers from czap
go from 0 to three-digit values when starting playback of anything on
the screen using the nvidia driver.

I have now swapped the PC for an older Dell Optiplex with ATI graphics,
and it is working fine. I didn't manage to get the nv driver to drive my
40 inch Sony HDTV at any resolution higher than 480p, so that option
was not really an option.

Thanks again for leading me towards an explanation.


Hi. I have a suggestion to possibly help the noise problem.

First, try different resolutions on your monitor. It should not make a
big difference, but if it does, changing the VGA/DVI cable to DVI/VGA,
or to a different cable of the same type, could help a bit.

If you have a ferrite ring (sometimes the computer cabinet has one
that all the wires from the front-panel LEDs and buttons go through,
like power, reset, hdd and so on), try looping the power cable to the
nvidia card through it a couple of times. See this document:
http://www.pearsonelectronics.com/datasheets/technical-literature/Noise%20Suppression%20.pdf

Ferrites are used to reduce high-frequency noise, and my guess would
be that exactly that is what is causing the problems.

Using a ferrite on the signal cable from the antenna, as in the PDF
document, could also possibly improve the quality, although I really
doubt that would help in this case.

If you are close to Oslo, you could drop by Elfa and get one of those
rings. Remember to get one that is big enough for the power cable to
go through. It is not difficult to disconnect the wires from the plug,
but if it is not necessary, then why do it?
Homer: If something is hard to do, it's not worth doing

-Morgan-



Re: [linux-dvb] Problem with TT C1500

2007-05-07 Thread Morgan Tørvolt

 First, try different resolutions on your monitor. It should not make a
 big difference, but if it does, changing the VGA/DVI cable to DVI/VGA,
 or to a different cable of the same type, could help a bit.

I need to run at 1920x1080_50Hz on the DVI (to HDMI) to make full use
of the monitor. All other resolutions are really of academic interest
only, and intermediate solutions.


Yes, and that is exactly my point. If the noise comes from the DVI
output circuit, you could possibly improve things by changing to a
different cable. You can check this by just swapping resolutions. This
is just for the academic part of the equation.

As a side note, maybe you can even get a better image using a VGA cable.
I have that with my projector. It is 720p, but running 720p into the
projector over HDMI causes it to scale the image so that it cuts off
the edges. This is because some service providers send out a crappy
picture with artifacts along the sides, and Sony decided that it was
best to just do some scaling and remove them. With VGA I get no such
scaling, and a 1:1 pixel ratio.

-Morgan-



Re: [linux-dvb] Problem with TT C1500

2007-05-03 Thread Morgan Tørvolt

You wouldn't by any chance know of any good devices to amplify / clean
the DVB signal before it's fed into the DVB card? I tried with an
ordinary antenna amplifier, but that made even my set-top box, which
normally works fine, go bananas too.


If the Nvidia card introduces noise, it probably happens inside the
card and not on the way to it, so this will most probably not help
much. Using an amplifier on an already good signal will cause more harm
than good; it should only be used if the original signal is not strong
enough.

-Morgan-



[linux-dvb] Why is it called BER?

2007-04-18 Thread Morgan Tørvolt

Hi.

I have a question regarding the frontend. You ask for:

ioctl(Fd_frontend,FE_READ_BER, ber_value_goes_here)

As I have not seen the result from this that I would expect from the
documentation here:
http://www.linuxtv.org/docs/dvbapi/DVB_Frontend_API.html, I am sending
this question/request to the mailing list.

Why did you add an R to the end of FE_READ_BE?

Using BER would mean that the value expresses a rate of bit errors
over, say, a second (which is the normal way of looking at it, at least
with professional satellite equipment). Usually one states a BER as
e.g. 3.5 * 10^-6, meaning that on average 3.5 out of every million bits
are in error.

From the ioctl call to the frontend with FE_READ_BER though, the value
seems to be the number of bit errors since the last call asking for it,
which makes the R sound a bit wrong in my ears: it is more a count of
errors than a rate of errors.
This could of course be down to my hardware, which is a TT-1500S.

To calculate the correct bit error rate before FEC, one needs to know
the modulation (usually QPSK) and the symbol rate. Since QPSK carries
two bits per symbol, a symbol rate of 30 Msym/s gives 60 Mbit/s of
data. With that, you can divide the bit-error count (the BE, not R) by
the bit rate times the amount of time the count was accumulated over to
get a correct BER. I guess most people know this, so I say it only to
make sure we are on the same page =)
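
A sketch of that calculation, assuming QPSK and that the application itself tracks the time between FE_READ_BER calls (the helper name is mine, not part of the DVB API):

```c
#include <math.h>

/* Turn the accumulated bit-error count into a true rate.  QPSK carries
 * 2 bits/symbol, so a 30 Msym/s carrier carries 60 Mbit/s. */
static double ber_from_count(unsigned int bit_errors,
                             unsigned int symbol_rate, /* symbols per second */
                             double interval_s)        /* accumulation time  */
{
    const unsigned int bits_per_symbol = 2; /* QPSK */
    double bits = (double)symbol_rate * bits_per_symbol * interval_s;
    return bits > 0.0 ? bit_errors / bits : 0.0;
}
```

For example, 210 errors counted over one second on a 30 Msym/s QPSK carrier give 210 / 60e6 = 3.5 * 10^-6.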

My advice would be to change the current FE_READ_BER to FE_READ_BEC
(Bit Error Counter), and to phase out the current definition by making
it obsolete.
I would also like a call that tells you how long ago the counter was
last read, so that you can use it to get a correct value for BER before
FEC (using the symbol rate, of course), and a flag that tells you
whether the counter has reached its maximum value. I have seen it climb
to a maximum (just below 65536 somewhere) and stay fixed there.

As for BER after FEC, I have no idea how to get hold of that
information currently. Does it have something to do with
FE_READ_UNCORRECTED_BLOCKS? I would guess so, but that counter also
seems to be reset upon read. Also, satellite TV muxes use Reed-Solomon
coding. Are the uncorrected blocks the uncorrected FEC blocks or the
uncorrected RS blocks? And what happens to the data with known bit
errors? Does it get thrown away?

I will hopefully be able to get hold of some professional test
equipment in a while to make a decent integer-to-decibel table for
signal strength and signal-to-noise ratio. I wonder whether it would be
correct for all hardware, though, or whether it differs from frontend
to frontend.

Best regards, and thanks for all the good work!

-Morgan-



Re: [linux-dvb] how to full satellite (all transponders) scan?

2007-04-06 Thread Morgan Tørvolt

Nice idea, but I think 1 MHz is a bit heavy. I have seen an approach on
a commercial STB where you could give the starting frequency and the
step size (e.g. 8 kHz) to do a scan. Shouldn't a similar approach also
work with scan?


I've had success locking on a 27500 mux while being as much as 15 MHz
off, so going at 5 or 8 MHz per step should be more than enough. There
was some talk about an API change for this a long time ago, since some
demods support scanning in some ways, but I think it never ended up in
anything useful (according to this mailing list at least). You need to
scan using different symbol rates though, which makes it a pain; there
are so many of them. There should be a list of regularly used symbol
rates somewhere, but I have found none.
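
A brute-force version of such a scan could look like the sketch below; `try_lock` stands in for the real work (issuing FE_SET_FRONTEND and polling FE_READ_STATUS for FE_HAS_LOCK), and the symbol-rate list is illustrative, not authoritative:

```c
typedef int (*lock_fn)(unsigned int freq_khz, unsigned int symbol_rate);

/* Step across the band, trying each frequency with a short list of
 * commonly seen DVB-S symbol rates.  Returns the number of locks. */
static unsigned int blind_scan(unsigned int start_khz, unsigned int end_khz,
                               unsigned int step_khz, lock_fn try_lock)
{
    static const unsigned int srs[] = { 27500000, 22000000, 6111000, 3333000 };
    unsigned int found = 0;
    for (unsigned int f = start_khz; f <= end_khz; f += step_khz)
        for (unsigned int i = 0; i < sizeof srs / sizeof srs[0]; i++)
            if (try_lock(f, srs[i])) { found++; break; }
    return found;
}

/* Stand-in for a real frontend: "locks" only at 10750 MHz, SR 27500. */
static int demo_lock(unsigned int freq_khz, unsigned int symbol_rate)
{
    return freq_khz == 10750000 && symbol_rate == 27500000;
}
```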

-Morgan-



Re: [linux-dvb] Tuner sensitivity - details not on linux dvb wiki?

2007-03-04 Thread Morgan Tørvolt

I thought the MT352 is a demodulator...  isn't the tuner a separate entity??
For example my cards both use a MT352 demod and they also both have a Thomson
7579 tuner so it may be that your devices both had the MT352 but
different tuners... here's what I discovered on the wiki

http://www.linuxtv.org/wiki/index.php/Demodulator

incidentally, was the misbehaving USB stick one of the ones listed between
33-42 in the following list??

http://www.linuxtv.org/wiki/index.php/DVB_USB#Twinhan_DVB-T_USB2.0

Viktor


The reception quality of a card comes down to two things, the tuner
and the demodulator. After the demodulator the signal is digital, so
there is not much more to be done other than running it through error
correction, which will always be done the same way. On DVB-S, the
tuner is the most important part, as virtually all muxes run with the
simple QPSK modulation. On DVB-T the modulation is quite complex. The
fact that a single frequency can be transmitted from different places
with the same data, together with all the signals bouncing off
mountains and such, gives the demodulator a much harder job than with
DVB-S. How the connection is made between tuner and demodulator can
also affect the signal a lot, since some designs will introduce more
noise on the way to the demod. How much that last point affects
performance I cannot tell, but it is very individual for each setup.
The performance of these cheap tuners and demods will also vary quite a
bit, so luck has something to do with the performance as well.

That being said, this type of equipment will usually have varying
performance, and as much as 3 dB difference in signal sensitivity is
not uncommon. Still, a bad sample of one model could be as good as a
good sample of the other; it is impossible to tell for certain. One
could make a database where everyone with at least two tuners could
enter which ones they have and what the performance difference is, but
I am not sure how accurate that would stay over time.

An inline variable attenuator is good for signal-sensitivity testing
in many cases, but attenuating the signal also attenuates the noise,
so the carrier-to-noise ratio stays the same. That must be taken into
account when testing like this. If the difference between two cards is
that one gets better signal quality from a given carrier-to-noise
level, but the other has a higher-gain preamp, the one with the better
preamp will win this test, yet it is not necessarily the better card.
Adding an external preamp will give the loser a stronger signal to work
with, and hence better signal quality, while the same could do little
for the test winner, which already has a strong enough signal but is
unable to take advantage of it. Testing at a controlled carrier-to-noise
ratio requires a white-noise generator, which is very expensive, so the
attenuator is in practice the only solution most people have.

-Morgan-



Re: [linux-dvb] How does bad reception influence quality of macroblocks?

2007-02-16 Thread Morgan Tørvolt

I am a little puzzled as to why the second entry in your example is
showing uncorrected errors and yet has a ber of 0.


Many receivers (at least professional satellite modems) calculate the
BER rather than actually measuring it, and vice versa. The theory is
very clear here, and agrees quite well with real-life performance. If
you have the carrier-to-noise level, you can calculate the Eb/No value,
which more or less directly translates to a bit-error rate. It also
goes the other way, so if you have the bit-error rate you can calculate
the C/N (given, of course, the symbol rate, FEC and the other
parameters of the carrier).
This could cause the receiver to see the signal level as very good and
calculate a good BER, while the actual Reed-Solomon/FEC error
correction still sees that there were errors anyway, because of poor
synchronization. One should never assume that things are all good after
only a few seconds.

That could be it. It could also very well not be. Or it could be that
the BER and UNC reported are not for the same period of time.



Re: [linux-dvb] genlock

2007-01-15 Thread Morgan Tørvolt

Hi

On 15/01/07, Rainer Schubert [EMAIL PROTECTED] wrote:

On Mon, 15 Jan 2007, Samuel Goto wrote:

 Hi everyone,

Hi Samuel,

   I am developing a DVB-S1 transmitter. Its input is an MPEG TS from an
 MPEG encoder and its output is a DVB-S1 stream to the RF satellite.

OK.

   I am having problems synchronizing the input bit rate to the output
 baud rate and I was wondering if anyone could help me out on this list ( or
 suggest me a better one =) ). The transmitter works for 7 to 8 minutes, but
 then its output buffer overflows ( showing that there is a small difference
 from the input rate to the necessary output rate ).

The simplest way I can think of is to implement some kind of
handshaking between the transmitter and your transmitting application.

You should give the transmitter a chance to say please stop sending
data until I am ready to receive more. It has been done on telephone
modems in the past.


I think this is different from a modem. A modem does not usually
transmit live data; an MPEG encoder does. You cannot easily discard
part of an MPEG stream either. The number of bits required to encode a
picture depends heavily on the picture itself, and it is not possible
to determine it exactly in advance. This means that you would need a
large buffer if the bitrate goes too high for a moment, and that causes
several problems. Firstly, the PCR does not arrive at the correct time,
wreaking havoc with the internal clock of the receiver (which syncs on
it); secondly, the decoder is supposed to get PES packets from the MPEG
stream according to the DTS (the decoding timestamp in the PES stream),
which also picks up a lot of jitter in this case. The result would be
that most receivers would probably fail miserably at decoding the
stream.

My approach would be to lower the bitrate of the MPEG encoder to
something that fits within the transmitted carrier (e.g. the carrier
bandwidth minus 100 kbit/s), then let the transmitting application
request packets whenever it needs to send one (this makes sure that you
get exactly the right output bandwidth). If there is no packet
available in the buffer from the encoder, it should get a stuffing
packet instead. This slightly increases the bandwidth of the stream,
but causes no problems with the PCR, DTS, PTS or any other timestamps.
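
A stuffing (null) packet is just a fixed 188-byte TS packet on PID 0x1FFF; a minimal sketch of building one:

```c
#include <stdint.h>
#include <string.h>

#define TS_PACKET_SIZE 188

/* Build an MPEG-TS null packet (PID 0x1FFF).  The mux inserts these
 * whenever no real packet is ready, so the output bitrate stays exactly
 * constant without touching PCR/DTS/PTS timing. */
static void make_null_packet(uint8_t pkt[TS_PACKET_SIZE])
{
    memset(pkt, 0xFF, TS_PACKET_SIZE);  /* payload: stuffing bytes    */
    pkt[0] = 0x47;                      /* sync byte                  */
    pkt[1] = 0x1F;                      /* TEI/PUSI/prio 0, PID high  */
    pkt[2] = 0xFF;                      /* PID low byte -> 0x1FFF     */
    pkt[3] = 0x10;                      /* payload only, CC = 0       */
}
```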

Regards
-Morgan-



   Thanks for your attention,

   Cya, Sam

Regards,
Rainer
--
Rainer Schubert - Linux TV User
Amateur Radio Call DL6HBO






[linux-dvb] Questing regarding transport_error_indicator

2006-10-21 Thread Morgan Tørvolt

Hi

What would be the common cause of the transport_error_indicator being
set? Is it set in hardware or in the driver?
I sometimes seem to get bursts of packets with the indicator set. On
some channels it comes very regularly, like every 5 seconds or so, and
it is often accompanied by picture blocking and sound breaking up.
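
For reference, the indicator is bit 7 of the second header byte of each 188-byte TS packet, so checking for it from userspace is trivial (a sketch):

```c
#include <stdint.h>

/* transport_error_indicator is bit 7 of byte 1 of the TS header; it is
 * typically set by the demodulator/FEC when a packet could not be
 * repaired, and passed through unchanged by the driver. */
static int ts_transport_error(const uint8_t pkt[188])
{
    return pkt[0] == 0x47 && (pkt[1] & 0x80) != 0;
}
```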

-Morgan-



Re: [linux-dvb] TechnoTrend TT-connect S-2400

2006-10-18 Thread Morgan Tørvolt

Hi

I am also very interested in the state of this driver, if there is one.

I would be willing to provide hardware (an S-2400
http://shop.technotrend.de/shop.php?mode=show_detaillang=degroup=3sid=30102c812e5e4c7fa7d54e2119db9464s=id=23
) to a couple of developers if anyone wants to get their hands dirty
=) The hardware would be yours to keep (send me a private message).
Too bad laptops don't have PCI slots =(


-Morgan-

On 07/10/06, Jens Krehbiel-Gräther [EMAIL PROTECTED] wrote:

Hi!

Is there no interest in supporting this device? Or is there nobody out
there who could tell me something about the driver state? Is it in
development, or will it never be supported under Linux?
It is a USB2.0 DVB-S device from TechnoTrend

Output of lsusb -v:

Bus 001 Device 002: ID 0b48:3006 TechnoTrend AG
Device Descriptor:
  bLength 18
  bDescriptorType 1
  bcdUSB   2.00
  bDeviceClass0 (Defined at Interface level)
  bDeviceSubClass 0
  bDeviceProtocol 0
  bMaxPacketSize0 64
  idVendor   0x0b48 TechnoTrend AG
  idProduct  0x3006
  bcdDevice1.00
  iManufacturer   1 TechnoTrend
  iProduct2 TT-USB2.0
  iSerial 0
  bNumConfigurations  1
  Configuration Descriptor:
bLength 9
bDescriptorType 2
wTotalLength  249
bNumInterfaces  1
bConfigurationValue 1
iConfiguration  0
bmAttributes 0xe0
  Self Powered
  Remote Wakeup
MaxPower   10mA
Interface Descriptor:
  bLength 9
  bDescriptorType 4
  bInterfaceNumber0
  bAlternateSetting   0
  bNumEndpoints   3
  bInterfaceClass   255 Vendor Specific Class
  bInterfaceSubClass  0
  bInterfaceProtocol  0
  iInterface  0
  Endpoint Descriptor:
bLength 7
bDescriptorType 5
bEndpointAddress 0x01  EP 1 OUT
bmAttributes2
  Transfer TypeBulk
  Synch Type   None
  Usage Type   Data
wMaxPacketSize 0x0200  1x 512 bytes
bInterval   0
  Endpoint Descriptor:
bLength 7
bDescriptorType 5
bEndpointAddress 0x81  EP 1 IN
bmAttributes2
  Transfer TypeBulk
  Synch Type   None
  Usage Type   Data
wMaxPacketSize 0x0200  1x 512 bytes
bInterval   0
  Endpoint Descriptor:
bLength 7
bDescriptorType 5
bEndpointAddress 0x82  EP 2 IN
bmAttributes1
  Transfer TypeIsochronous
  Synch Type   None
  Usage Type   Data
wMaxPacketSize 0x0000  1x 0 bytes
bInterval   1
Interface Descriptor:
  bLength 9
  bDescriptorType 4
  bInterfaceNumber0
  bAlternateSetting   1
  bNumEndpoints   3
  bInterfaceClass   255 Vendor Specific Class
  bInterfaceSubClass  0
  bInterfaceProtocol  0
  iInterface  0
  Endpoint Descriptor:
bLength 7
bDescriptorType 5
bEndpointAddress 0x01  EP 1 OUT
bmAttributes2
  Transfer TypeBulk
  Synch Type   None
  Usage Type   Data
wMaxPacketSize 0x0200  1x 512 bytes
bInterval   0
  Endpoint Descriptor:
bLength 7
bDescriptorType 5
bEndpointAddress 0x81  EP 1 IN
bmAttributes2
  Transfer TypeBulk
  Synch Type   None
  Usage Type   Data
wMaxPacketSize 0x0200  1x 512 bytes
bInterval   0
  Endpoint Descriptor:
bLength 7
bDescriptorType 5
bEndpointAddress 0x82  EP 2 IN
bmAttributes1
  Transfer TypeIsochronous
  Synch Type   None
  Usage Type   Data
wMaxPacketSize 0x00bc  1x 188 bytes
bInterval   1
Interface Descriptor:
  bLength 9
  bDescriptorType 4
  bInterfaceNumber0
  bAlternateSetting   2
  bNumEndpoints   3
  bInterfaceClass   255 Vendor Specific Class
  bInterfaceSubClass  0
  bInterfaceProtocol  0
  iInterface  0
  Endpoint Descriptor:
bLength 7
bDescriptorType 5
bEndpointAddress 0x01  EP 1 OUT