I have some inexpensive security cameras in my house to watch my Dad,
who has Alzheimer's. They are German-made, wireless, color with audio,
day/night, and they use the 2.4 GHz band, which subjects them to
interference. My house is in a bad area for wireless: range is
non-existent beyond 50 feet and interference is constant. But for my
needs they work great. The cameras have their own DC adaptors but
will also run for around six hours off a nine-volt battery; they come
with a little adaptor for this purpose. I suppose you could hook this
up temporarily if you wanted to check on something outside overnight,
although I haven't tried it.
One of the adaptors went out and I went to replace it. They are rated
at 8 volts DC, 200 mA, and this one was only putting out around 6
volts, which meant no picture. I have a lot of old adaptors in my
junk box, but 8 volts is an odd value, so I hooked up a 9 volt,
500 mA adaptor and the camera works better than before, with no
interference. Obviously more power equals more range, which makes me
wonder why they used 8 volts to begin with.
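In case it helps, here's the rough math I did on the difference. This assumes the camera regulates internally with a simple linear regulator and draws its rated 200 mA; I don't know the actual circuit, so treat these numbers as a back-of-envelope estimate only.

```python
# Rough numbers for the adaptor swap. Assumptions (not from the manual):
# the camera draws its full rated 200 mA, and the excess input voltage
# is burned off as heat in an internal linear regulator.

rated_v = 8.0   # volts, per the camera's label
new_v = 9.0     # volts, the replacement adaptor
load_a = 0.200  # amps, the camera's rated draw (assumed worst case)

# How far over spec the input is, as a percentage.
overvoltage_pct = (new_v - rated_v) / rated_v * 100

# Extra heat the regulator would have to dissipate at full load.
extra_heat_w = (new_v - rated_v) * load_a

print(f"Overvoltage: {overvoltage_pct:.1f}%")          # 12.5%
print(f"Extra dissipation: {extra_heat_w:.2f} W")      # 0.20 W
```

So the swap is only about a 12.5% overvoltage and a fifth of a watt of extra heat, if my assumptions hold, which may be why nothing has let out the magic smoke yet.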
I would not have attempted the 9 volt upgrade except that they have
it set up to run off a 9 volt battery. With such good results I am
considering upgrading all the cameras. Is the nine-volt adaptor
putting the camera at risk? Could it blow out the IR LEDs or
something? Can any EE in the group give me some advice?
thanks!
- [H] too much voltage? Winterlight