FAQ | Feb 17, 2023
W / V = A (watts divided by volts equals amps)
When connecting a transformer to an electrical power source, you need to calculate the current it will draw through its primary. You should then connect the transformer to a circuit whose breaker has an equal or higher current rating, so the breaker will not trip during normal operation. The current depends on two factors: the voltage of the power source to which you connect the transformer and the amount of power, in watts, that it will consume. Both factors are part of the transformer design.
Find out the voltage rating of the transformer you are hooking up. If it goes to a home circuit, it will be either 120 or 240 volts. Check the specifications to be sure you have the right transformer for your project.
Find out the wattage rating of the transformer. Divide the wattage by the voltage. For example, if you have a 300-watt lighting transformer and you are going to hook it up to a standard 120-volt socket, divide 300 by 120. The transformer will draw 2.5 amps of current. Most home wall sockets are on 15-amp circuit breakers, so this particular transformer will not draw enough current to be a problem.
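To make the arithmetic concrete, here is a minimal Python sketch of the calculation described above. The function name primary_current_amps is my own, not from the original.

```python
def primary_current_amps(watts: float, volts: float) -> float:
    """Current drawn through the primary: amps = watts / volts."""
    return watts / volts

# The 300 W lighting transformer on a standard 120 V socket from the example.
current = primary_current_amps(300, 120)
print(f"Current draw: {current:.2f} A")  # Current draw: 2.50 A
print(current <= 15)                     # True: below a 15 A breaker rating
```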
Use the same formula for all transformers. If for some reason you need a larger transformer to operate appliances, you still divide the wattage by the voltage to find the current. For a 120-volt primary, 2000-watt transformer, divide 2000 by 120 for the current (2000 watts / 120 volts = 16.67 amps). For a 240-volt, 3000-watt transformer, the current is 12.5 amps (3000 watts / 240 volts = 12.5 amps).
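Continuing with the same formula, here is a short sketch that checks several transformers against a list of breaker sizes and picks the smallest one rated at or above the draw, following the rule stated at the start of this article. The list of standard sizes (15, 20, 30, 40 A) is an assumption based on typical residential breakers, not something from the original; check your local electrical code.

```python
# Hypothetical list of common breaker sizes in amps (an assumption,
# not from the original article; verify against your local code).
BREAKER_SIZES = [15, 20, 30, 40]

for watts, volts in [(2000, 120), (3000, 240)]:
    amps = watts / volts  # same formula: W / V = A
    # Smallest standard breaker rated at or above the draw.
    breaker = next(size for size in BREAKER_SIZES if size >= amps)
    print(f"{watts} W / {volts} V = {amps:.2f} A -> use a {breaker} A breaker")
```

Running this prints 16.67 A (needing a 20 A breaker) for the 2000-watt transformer and 12.50 A (fitting a standard 15 A breaker) for the 3000-watt one, matching the figures above.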