Guest Ted Grener Posted Saturday at 12:55 AM
I bought a 110 V AC to 12 V DC power supply rated at 15 amps for my 20-watt mobile. When I plugged it in, it shows 12.9 volts. Will this lower my output, since the vehicle supplies 13.8 volts?
BoxCar Posted Saturday at 01:00 AM
A lot of power supplies have an output-voltage adjustment with a range of up to about 15%. Does your supply have one? Post the power supply model and where it came from, and we may be able to help you find the adjustment screw or give you better advice on what to do.
WRHS218 Posted Saturday at 02:12 PM
This is purely anecdotal: I have a 20-watt mobile in my vehicle which gets 13.8 V (minimum) when the truck is running and 12-12.8 V when it is not. While I have not done any serious tests to determine maximum distances, I see no difference in signal strength or clarity at known distances whether the vehicle is running or not. The math would seem to show there wouldn't be much difference between 12.9 V and 13.8 V. If I remember correctly, my 20-watt radio draws around 8-9 amps when transmitting; I measured it years ago when I thought I really cared. These days, if it works and the magic smoke doesn't leak out, I don't worry about it.
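A minimal Python sketch of the arithmetic behind that point. The 8-9 amp draw and roughly 20 watts of output are the figures from the post above; the 8.5 A mid-point and the 18 W "worst case" output are only assumptions for illustration, not measurements.

```python
from math import log10

def db_difference(p_ref_watts: float, p_watts: float) -> float:
    """Power change in dB relative to p_ref_watts."""
    return 10 * log10(p_watts / p_ref_watts)

# Figures from the post above; 8.5 A is simply the mid-point of "8-9 amps".
supply_volts = 12.9
transmit_draw_amps = 8.5
dc_input_watts = supply_volts * transmit_draw_amps

print(f"DC input on transmit: about {dc_input_watts:.0f} W for roughly 20 W of RF")

# Even if the lower supply voltage shaved a couple of watts off the output,
# the change at the receiving end would be well under 1 dB:
print(f"20 W down to 18 W is {db_difference(20, 18):+.2f} dB")
```

With those assumed numbers it works out to about 110 W of DC input and a difference of under half a dB, which no receiver will notice.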
SteveShannon Posted Saturday at 02:25 PM
12 minutes ago, WRHS218 said: This is purely anecdotal: I have a 20-watt mobile in my vehicle which gets 13.8 V (minimum) when the truck is running and 12-12.8 V when it is not. While I have not done any serious tests to determine maximum distances, I see no difference in signal strength or clarity at known distances whether the vehicle is running or not. The math would seem to show there wouldn't be much difference between 12.9 V and 13.8 V. If I remember correctly, my 20-watt radio draws around 8-9 amps when transmitting; I measured it years ago when I thought I really cared. These days, if it works and the magic smoke doesn't leak out, I don't worry about it.
You're right. Although a wattmeter might show a measurable bump in output power, it will make zero real difference in range.
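To put that range claim in rough numbers: under a free-space assumption, line-of-sight range for a fixed receive threshold scales with the square root of the transmit power ratio, so small dB changes buy almost nothing. This is only a sketch; terrain, antennas, and obstructions dominate in practice.

```python
from math import sqrt

def range_gain_percent(delta_db: float) -> float:
    """Extra free-space range (%) for a given transmit-power increase in dB."""
    power_ratio = 10 ** (delta_db / 10)
    return 100 * (sqrt(power_ratio) - 1)

for delta_db in (0.5, 1.0, 3.0):
    print(f"+{delta_db} dB of transmit power buys about "
          f"{range_gain_percent(delta_db):.0f}% more free-space range")
```

Even a full 3 dB (doubling the power) only stretches free-space range by about 40%; a half-dB change is closer to 6%.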
Guest WRXI733 Posted Saturday at 06:11 PM
17 hours ago, Guest Ted Grener said: I bought a 110 V AC to 12 V DC power supply rated at 15 amps for my 20-watt mobile. When I plugged it in, it shows 12.9 volts. Will this lower my output, since the vehicle supplies 13.8 volts?
As others have said, the minute (insignificant) drop in wattage wouldn't be noticed on the receiving end, but I would be curious whether it sags below 12.9 V on transmit. 3's, Greg
nokones Posted Saturday at 10:42 PM
Since I have all my Saturday chores done, I decided to do a quick bench test. I don't have a 20-watt mobile to test, but I do have a 25-watt mobile, and I highly doubt that 5 watts of RF output would make a difference in any farz or much of a power draw. I used a Kenwood TK-880-1 UHF 25-watt mobile radio, a Bird 43 wattmeter that was recalibrated last year, an EMC Corp 150-watt 50-ohm power terminator (dummy load), and a Powerwerx 30-amp variable power supply. My first test was with the voltage regulated at 12.9 volts: the standby power draw was 0.3 amp, the transmit power draw was 6.3 amps, and the RF output was 23 watts. At 13.8 volts, the standby power draw was 0.3 amp, the transmit power draw was 6.6 amps, and the RF output was 26 watts. The voltage did not drop one iota. That is a 0.53 dB difference in RF power levels, and my farz guesstimate of the difference could probably be measured with a short tape measure or maybe a yardstick.
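Plugging those bench numbers into a short Python sketch reproduces the 0.53 dB figure and gives a rough DC-in versus RF-out comparison. The "efficiency" here lumps the whole radio's transmit draw together, not just the PA stage, so treat it as a ballpark only.

```python
from math import log10

# Bench figures from the post above (TK-880-1 into a dummy load).
tests = {
    12.9: {"draw_amps": 6.3, "rf_watts": 23},
    13.8: {"draw_amps": 6.6, "rf_watts": 26},
}

for volts, t in tests.items():
    dc_in_watts = volts * t["draw_amps"]
    efficiency = 100 * t["rf_watts"] / dc_in_watts
    print(f"{volts} V: {dc_in_watts:.1f} W DC in, {t['rf_watts']} W RF out "
          f"(~{efficiency:.0f}% overall)")

print(f"26 W vs 23 W = {10 * log10(26 / 23):.2f} dB")
```

Both test points land around 28-29% overall, and the output difference is the same half a dB either way you compute it.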
AdmiralCochrane Posted Sunday at 01:00 AM
Fifty years ago my father's CB became remarkably clear for a while. Then he noticed he was replacing a lot of bulbs on his car. Luckily he caught it before the battery was cooked: the voltage regulator (the old mechanical type) had drifted up to about 18 volts.
WRTC928 Posted Sunday at 02:47 PM
Logic dictates that there is some point at which lower voltage will result in decreased performance and possibly even damage the radio, but I'm not willing to potentially sacrifice a radio to find out where that point is. For a while, I was using an Anysecu WP9900 plugged into the "cigarette lighter" socket on a jump-start battery pack. It consistently showed 11.8-11.9 volts, but it didn't drop appreciably when transmitting. I didn't test its power output, but I was getting signals out to my favorite repeater with full quieting. I did test it when I had it hard-wired in my truck: it ran 13.8-14.8 volts, and max output was ~19-20 watts (25 watts nominal max power). I now have it attached to a 50 Ah LiFePO4 battery as a backup during storms; it shows 12.3-12.8 volts and tops out at ~18-19 watts. It does seem as if losing 2 volts cost me a watt. Neither I nor the person I'm talking to is likely to notice a difference in signal strength or distance.
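Running those before-and-after figures through the same arithmetic, using the mid-points of the quoted ranges (an approximation, since the posted values are ranges, not single readings):

```python
from math import log10

# Mid-points of the figures above: ~19-20 W hard-wired, ~18-19 W on the LiFePO4 pack.
hardwired_watts = 19.5
battery_watts = 18.5

delta_db = 10 * log10(battery_watts / hardwired_watts)
print(f"{battery_watts} W vs {hardwired_watts} W = {delta_db:+.2f} dB")

# By the common 6 dB-per-S-unit convention, that is a few hundredths of an S-unit:
print(f"Roughly {abs(delta_db) / 6:.2f} of an S-unit at the far end")
```

That works out to about a quarter of a dB, which supports the conclusion that nobody on the other end will see it.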
dugcyn Posted Sunday at 02:58 PM
I run a solar system to power my radios and light my shack, etc. I recently received a report that my signal was not as good as usual on our local repeater. I did some checking, and sure enough I had dragged my voltage down by running too many lights all day in the shack; it was at 11.5 volts. The next day, after fully charging, the signal report came back loud and clear (solar was at 13 volts). This is a 50-watt unit set at mid power (13 watts?). So it seems it does make a difference on my setup. Just sharing.
nokones Posted Sunday at 04:15 PM
1 hour ago, WRTC928 said: Logic dictates that there is some point at which lower voltage will result in decreased performance and possibly even damage the radio, but I'm not willing to potentially sacrifice a radio to find out where that point is. For a while, I was using an Anysecu WP9900 plugged into the "cigarette lighter" socket on a jump-start battery pack. It consistently showed 11.8-11.9 volts, but it didn't drop appreciably when transmitting. I didn't test its power output, but I was getting signals out to my favorite repeater with full quieting. I did test it when I had it hard-wired in my truck: it ran 13.8-14.8 volts, and max output was ~19-20 watts (25 watts nominal max power). I now have it attached to a 50 Ah LiFePO4 battery as a backup during storms; it shows 12.3-12.8 volts and tops out at ~18-19 watts. It does seem as if losing 2 volts cost me a watt. Neither I nor the person I'm talking to is likely to notice a difference in signal strength or distance.
The factory specifications for the radio in question should list its operating voltage range. Running it outside that range could be detrimental to the radio's electronic components.
WRYZ926 Posted 2 hours ago
I know the Wouxun KG-1000G and Radioddity DB20-G call for 11.7-15.8 volts. I couldn't find the specs for the Midland MXT500, but it's probably the same; pretty much all transceivers call for roughly 11.7-15.8 volts. Yes, you will see a small drop in output power at the lowest allowed voltage compared to 13.8 volts or the maximum allowed voltage, but it is not enough to make a difference in signal strength or farz. I haven't noticed any differences with my QRP HF radios that have internal batteries. They usually put out 5 watts on the internal battery and 8-10 watts on an external power source, depending on the radio.
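For comparison, the QRP example works out like this, using the 5 W internal versus 8-10 W external figures from the post above (the S-unit conversion uses the common 6 dB-per-S-unit convention):

```python
from math import log10

internal_watts = 5
for external_watts in (8, 10):
    gain_db = 10 * log10(external_watts / internal_watts)
    print(f"{internal_watts} W -> {external_watts} W is +{gain_db:.1f} dB "
          f"(about {gain_db / 6:.1f} S-unit)")
```

Even doubling the power from 5 W to 10 W is only 3 dB, about half an S-unit on the other station's meter, which is consistent with not noticing a difference.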