exxos PSU dual voltage inputs (110V & 230V) why I'm not doing it..


Post by exxos »

In light of some discussions about dual voltage power supplies, I need to explain why I am not going to be supporting dual voltage inputs in my designs. It is actually rather complicated to explain why. To explain it all in-depth correctly would probably take 100+ pages. So I will try to cover the main issues with dual voltage systems, greatly simplifying and summarising a lot of things in order to keep the whole explanation manageable. Of course manufacturers have countless datasheets and application notes on everything, should anyone want to look into this in more detail. They will of course explain all this a whole lot better than I ever could :)

Firstly, all my current power supply designs have dual primary transformers. They can be wired for 110 V or 220 V and that is how they are supplied.

The question people have is: why can't I do a power supply that would work on both voltages without any extra effort or wiring? Unfortunately this is an extremely complex thing to explain, but I will give it a try and explain as simply as I can why this is not a realistic thing to do, at least with low voltage switch mode designs.

A short explanation is that low voltage switch modes have their best efficiency at a 50:50 duty cycle. Basically what this means is that the switching MOSFET will be turned on 50% of the time and turned off 50% of the time. This means if we were to output 5 V we would need a 10 V input to satisfy that condition. With those simple parameters, the switchmode will basically be operating at its most efficient. Of course there is an epic's worth of problems as to why deviating from this basic equation is a problem.
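As a quick sanity check, the ideal relation above (output voltage = on-time fraction × input voltage, ignoring diode drops and all losses) can be sketched in a few lines of Python. The helper name is mine, not anything from the actual designs:

```python
def duty_cycle(v_out, v_in):
    """Ideal buck converter: V_out = D * V_in, so D = V_out / V_in."""
    return v_out / v_in

# 5 V out from a 10 V rail lands exactly on the 50:50 sweet spot
print(duty_cycle(5.0, 10.0))  # 0.5
```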

Take for example if we were then to input 20 V into the same circuit: the switching MOSFET would have to be more like 25% on and 75% off to maintain the 5 V output. This by itself isn't generally a problem, but it is fast approaching the region where switch modes are at their least efficient. Of course if you do not care about efficiency then this is not so much of an issue.

You could easily end up with a 20% loss just in switching losses in the circuit, which means the overall efficiency would be 80%. This isn't really too shabby, as typically good switch modes fall around 75%-85% efficiency. While this may sound acceptable, if we were switching a 100 W load, we would lose 20 W in heat alone. So this puts into perspective that this is actually rather a lot of wasted power.
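The arithmetic here can be made explicit. A minimal sketch, treating the 100 W figure as power going in with 20% of it turning into heat (the function name is mine):

```python
def heat_loss(p_in, efficiency):
    """Watts turned into heat inside the supply for a given input power."""
    return p_in * (1.0 - efficiency)

# 100 W through an 80% efficient supply wastes 20 W as heat
print(round(heat_loss(100.0, 0.80), 1))  # 20.0
```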

My current power supply design is around 80% efficient, which is still pretty good, but even just a few watts of heat loss can soon heat up the power supply metal plate. So my aim is to reduce the heat loss and do away with needing to bolt the regulators to a heatsink at all. The side effect of greater efficiency is that we can run at higher output amps. Yay!

So back to the point.. MOSFETs basically have two types of losses: switching losses and conduction losses. Switching losses basically relate to the turn-on and turn-off times, the rise and fall times of the switching cycle. For example a typical MOSFET could have 50 ns turn-on and turn-off times, which isn't particularly a lot. The problem is the faster the MOSFET is switching, the more of its time is spent turning on and off, which increases the switching losses.

This is why power supplies suffer poor efficiency at light loads. The MOSFET may only be turned on for 5% of the time, yet in terms of losses it could easily spend 80% or more of that time passing through its switching region, which is a huge loss. You have to think of the MOSFET as a variable resistor which is turning on and off very fast under a load. Of course resistance has losses, and you want to keep both this loss and the time spent there as small as possible.

A similar situation: if we had 50 ns on and off times, this would mean the MOSFET's maximum frequency would be 10 MHz. But this basically means the MOSFET would be spending 100% of its time in its switching region and would get extremely hot very fast, which of course is not a good idea. So we run the MOSFET at a more realistic 1 MHz and we reduce the switching losses by a fair amount.
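The 10 MHz and 1 MHz figures follow directly from the transition times. A rough sketch, assuming a fixed 100 ns total spent in transitions per cycle (the 50 ns + 50 ns example above):

```python
def transition_fraction(t_transitions, f_sw):
    """Fraction of each switching period spent in the lossy transition region."""
    return t_transitions * f_sw

# 50 ns turn-on + 50 ns turn-off = 100 ns of every cycle spent switching
print(round(transition_fraction(100e-9, 10e6), 2))  # 1.0 -> always switching at 10 MHz
print(round(transition_fraction(100e-9, 1e6), 2))   # 0.1 -> 10% of the time at 1 MHz
```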

Of course it would be logical to think we could then run it even slower to reduce the switching losses, and generally this is correct. But the problem is slower frequencies also mean the MOSFET will be turned on for longer. MOSFETs generally have what is called an on-state resistance, which is generally around 0.01R - 0.100R. In a lot of cases this could pretty much be ignored, but it is never quite that simple.

The problem is if we turn the MOSFET on for 80% of the time, switching losses will be very low, which is of course a good thing, and as the on-state resistance is very low it is logical to assume that 80% on-time should be very good. But the problem is this depends on the output amperage from the power supply and the frequency.

When the MOSFET is turned on, it is basically charging the inductor, and the longer the MOSFET is turned on, the higher the amperage fed into the inductor. Amperage will keep on rising for as long as the MOSFET is turned on. And as the MOSFET has internal resistance, this produces a voltage drop, and the larger the voltage drop, the greater the losses will be. Generally once we get to around 10 A through the MOSFET, we are in a region where the voltage drop across the MOSFET is getting rather high and it will start to get extremely hot once again.
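The conduction-loss side of this is just I²R scaled by the fraction of time the MOSFET conducts. A sketch with hypothetical part values; the 50 milliohm on-state resistance is an assumption for illustration, not a figure from any real design:

```python
def conduction_loss(i_load, r_ds_on, on_fraction):
    """Average I^2 * R heating in the MOSFET while it conducts."""
    return i_load ** 2 * r_ds_on * on_fraction

# 10 A through an assumed 50 milliohm MOSFET at 80% on-time
print(round(conduction_loss(10.0, 0.050, 0.8), 2))  # 4.0 W of heat
```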

Also if we take into account the inductor itself: the more amperage flowing through it, the hotter it is going to get, which means the losses will be higher. The inductor of course has a resistance, which means it will have a voltage drop depending on the amperage through it. So the higher the amperage, the higher the voltage drop, and the hotter it is going to get. So it is actually more desirable to keep the current through the inductor as low as possible.
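The same I²R reasoning applies to the inductor's winding resistance (usually called DCR). Illustrative numbers only; the 20 milliohm DCR is an assumption:

```python
def inductor_loss(i_load, dcr):
    """I^2 * R heating in the inductor's winding resistance (DCR)."""
    return i_load ** 2 * dcr

# Halving the current quarters the heat (assumed 20 milliohm DCR)
print(round(inductor_loss(4.0, 0.020), 2))  # 0.32 W
print(round(inductor_loss(2.0, 0.020), 2))  # 0.08 W
```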

This would also mean we would have to increase the frequency in order to keep the same amperage throughput. While we can ramp the frequency up into the megahertz range, we take the losses away from the inductor and start introducing them into the MOSFET instead. As previously mentioned, the faster the MOSFET is switching, the more losses it is generally going to have.
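The frequency-versus-inductor-current trade follows from the standard buck ripple formula. A sketch assuming an ideal buck and a hypothetical 10 µH inductor:

```python
def ripple_current(v_in, v_out, f_sw, inductance):
    """Peak-to-peak buck inductor ripple: dI = (Vin - Vout) * D / (f * L)."""
    d = v_out / v_in  # ideal duty cycle
    return (v_in - v_out) * d / (f_sw * inductance)

# Doubling the switching frequency halves the ripple for the same inductor
print(round(ripple_current(10.0, 5.0, 500e3, 10e-6), 3))  # 0.5 A
print(round(ripple_current(10.0, 5.0, 1e6, 10e-6), 3))    # 0.25 A
```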

So to round up a little: switching a MOSFET on for a short amount of time is actually very inefficient, particularly with light loads, as is documented in a lot of switchmode design datasheets. On the other hand, turning the MOSFET on for a longer time results in other losses which also generate heat. The best compromise we can come to is to turn it on and off at 50%, or as near as possible. This is my aim in my current power supply designs. It also means the charge inductor only needs a very small number of turns, which means we can keep the resistance extremely low, which increases efficiency no end as I have already documented.

Of course it is not just about losses either. If we were driving 5 V at a 2 A load and the MOSFET was turned on 80% of the time, this leaves very little headroom to increase the output amperage. We would most likely hit 95% on-time with a 3 A load, and basically you're at a point where the switch mode circuit can no longer sustain the load and it will start to hiccup and malfunction. So it is more logical to keep the on-time at a more realistic 50% during a typical load of around 2 amps.
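The headroom squeeze can be shown with a toy model where a lumped series resistance steals input voltage as the load rises, forcing the on-time up. The 1.5 ohm figure below is deliberately exaggerated purely to reproduce the 80%/95% numbers in the paragraph above, not a realistic value:

```python
def on_time(v_out, i_load, r_series, v_in):
    """Toy buck model: series IR drops eat into the input, raising the on-time."""
    return (v_out + i_load * r_series) / v_in

# Exaggerated values chosen only to show the trend: 5 V out, 10 V in
print(on_time(5.0, 2.0, 1.5, 10.0))  # 0.8  -> 80% on-time at 2 A
print(on_time(5.0, 3.0, 1.5, 10.0))  # 0.95 -> almost no headroom left at 3 A
```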

There are of course even more ways that this can get complicated, as we could find a MOSFET which switches faster than 50 ns to reduce the switching losses. Some MOSFETs can switch in as little as 5 ns. Awesome, isn't it? Well, yes and no. The problem is faster switching MOSFETs have a smaller die size, which is why they can switch faster. A smaller die size also translates to a higher on-state resistance. So while we could reduce the switching losses by a huge amount, we would also increase the on-state resistance by a huge amount. A typical situation of making something better in one respect while making it worse in another! So again, a compromise has to be made between the two.
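The fast-vs-slow die trade-off can be put into rough numbers. Both parts below are made up for illustration, and the switching-loss term uses the common triangular-transition approximation (0.5 × V × I × t × f):

```python
def mosfet_losses(i_load, duty, r_ds_on, t_transition, v_in, f_sw):
    """Rough conduction + switching losses, triangular-transition approximation."""
    conduction = i_load ** 2 * r_ds_on * duty
    switching = 0.5 * v_in * i_load * t_transition * f_sw
    return conduction + switching

# Two hypothetical parts at 2 A, 10 V in, 1 MHz, 50% duty:
fast_small_die = mosfet_losses(2.0, 0.5, 0.100, 10e-9, 10.0, 1e6)   # high Rds(on), fast edges
slow_large_die = mosfet_losses(2.0, 0.5, 0.010, 100e-9, 10.0, 1e6)  # low Rds(on), slow edges
print(round(fast_small_die, 3), round(slow_large_die, 3))  # 0.3 1.02
```

At this particular operating point the fast die wins overall despite its ten-times-higher on-state resistance; at a much heavier load the balance tips the other way, which is exactly the compromise described above.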

On the other side of this, we could also decrease the on-state resistance, which again is a trap a lot of people fall into: while the on-state resistance can be nearly zero, the time such a part takes to switch on and off is basically horrendous, so the switching losses will go through the roof. We must also take into account that the larger die sizes are slower and are only really good for motor control, not really any use for high-speed switching circuits.

The best overall compromise we could have is to run, for example, a 110 V input at a 75% on-state and a 220 V input at a 25% on-state (the lower input needs the longer on-time). This basically misses our 50% on-state target, which is the most efficient point. While we could run the power supply with these dual voltages, it has huge drawbacks, particularly with efficiency.
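For an ideal buck, doubling the input halves the on-time, so whichever winding is chosen the two duty cycles always end up a factor of two apart, straddling the 50% target rather than sitting on it. A sketch with hypothetical secondary voltages (the 7.5 V / 15 V rails are assumptions for illustration):

```python
def buck_duty(v_out, v_in):
    """Ideal buck on-time fraction: D = V_out / V_in."""
    return v_out / v_in

# One secondary winding serving both mains wirings: the two duties
# are forced a factor of two apart, neither landing on 50%
v_out = 5.0
for v_secondary in (7.5, 15.0):  # assumed rails from 110 V and 220 V wiring
    print(round(buck_duty(v_out, v_secondary), 3))  # 0.667 then 0.333
```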

It would also mean that the overall output current would be limited. If we were running at 75% on-time already, the power supply simply could not deliver much more current before it maxed out the cycle. This actually would not be a problem on the 25% on-time side, as the more the power supply was loaded, the more efficient it would become as it approached the 50% on-time region; though during a typical load this simply wouldn't happen anyway.

On the reverse of this, if we suddenly halved the amperage output from say 2 A to 1 A, our on-time would plummet to probably just a few percent. Again, efficiency is at rock bottom around that region. So of course this is not ideal either.

So what I do is take a typical figure of a 2 A load and make the power supply run at 50% on and off times, which makes it run at its most efficient point. Of course the drawback to this is that if we moved to 1 A or 3 A, the figure will alter and efficiency will start to suffer, though probably not by a drastic amount anyway. But the point is the power supply can cope with this rather easily, and the new design can actually pump out up to around 6 amps. If we went for dual voltage inputs, we would simply end up being limited to around 3 A because of the wide range of input voltages.

There's also the issue that a lot of the switchmode chips have a very limited input voltage range. While this may seem bad, it is actually not really relevant: as previously mentioned, simply having a wide input voltage range basically kills a good design. So with all this in mind, it is far better just to have a transformer and simply alter the wiring on the primary to suit either 110 V or 220 V inputs. Of course people could solder on a voltage selector switch if they were in dire need of it.

There are of course so many topologies and designs for power supplies that it would take years to go through them all. It's not only the overall design of the PSU either. For example, as people know, some Atari PSUs can run on 90-250 VAC input, though they are pushing to even get to 50% efficiency, which is why they get very hot! Cost is a factor as well. It's always cheaper to try to use off-the-shelf parts. If I went for a high voltage switchmode like the original STF PSUs, then I would likely have to get a custom-made transformer to the spec I want. Then we are back to £50+ per transformer and huge investment costs on my part. I also don't really want to re-create something which is basically the same as the original PSUs. I'd rather create something new and go with modern low voltage switchmode solutions.

So now that everyone is sufficiently confused about all this, it basically proves the point that it is not so simple to design a power supply which can run at two separate input voltages efficiently. While I agree with everyone that it would be a useful thing to have, I'm not prepared to sacrifice huge amounts of efficiency or output amperage capability to do it. With costs in the mix as well, not only development but production costs, I need to keep it all as low as possible. This also means the final product price is something more sane.

There are also research costs. I have a lot of experience in low voltage switching designs, so while it may take a year to do a new design, that's actually pretty fast! If I moved away from "known" designs, I would have to start over from the ground up, and it's just not viable to spend years going down some other design route for basically no good reason.

So I hope this gives an insight into "why" my PSUs are designed the way they are. They may look simple with few parts, but the work and costs which go into it all are astronomical, and of course compromises have to be made throughout.