Why is generation at 11 kV in power stations?

  Designing any component of a power system, be it a transformer, a generator, or any other device, is governed by two basic constraints: conductor size (copper) and insulation. The conductor size depends on the current level, while the insulation depends on the voltage level of the device.

So, for a particular power rating, it is the engineer's job to strike a trade-off between these two factors, thereby minimizing cost and losses.

P = √3 × V × I × cos(θ) holds for a 3-phase network. To minimize losses along the line (which are proportional to I²R), it is always preferable to transmit power at a higher voltage (220 kV, 400 kV, 765 kV, etc.), since I is inversely proportional to V for the same power.
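The scaling above can be sketched numerically. This is a minimal illustration, assuming a hypothetical 100 MW plant, a power factor of 0.9, and a line resistance of 5 Ω per phase (all illustrative figures, not from the original text):

```python
import math

def line_current(p_watts, v_line, pf=0.9):
    """Line current in a 3-phase system: I = P / (√3 · V · cosθ)."""
    return p_watts / (math.sqrt(3) * v_line * pf)

P = 100e6   # 100 MW delivered (illustrative)
R = 5.0     # assumed line resistance per phase, ohms (illustrative)

for v in (11e3, 220e3, 400e3):
    i = line_current(P, v)
    loss = 3 * i**2 * R   # total I²R loss across the three phases
    print(f"{v/1e3:>5.0f} kV: I = {i:8.0f} A, I²R loss = {loss/1e6:8.2f} MW")
```

Since I scales as 1/V, raising the voltage from 11 kV to 220 kV (a factor of 20) cuts the loss by a factor of 400, which is exactly why transmission happens at high voltage even though generation does not.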

So one might think it wiser to generate power at a very high voltage (220 kV) directly, rather than stepping it up after generating at a low voltage (11 kV, 22 kV, 25 kV, etc.). After all, the same amount of insulation will be used in the transformer anyway.

One thing we must keep in mind is that a transformer is a static device, while an alternator is a rotating device (spinning at perhaps 3000 rpm for turbo generators). Installing that much insulation would not only make the machine bulkier but could also leave it unbalanced in normal operation.

It is cheaper to generate at a relatively low voltage and then step it up for transmission. Hence, most power generating plants are designed to operate at 11 kV.

To generate at 33 kV, the generator might need to be twice the size of an 11 kV machine. So it is better to use a multi-stage step-up for transmission if need be.

One plausible explanation I have encountered is that the apparently standardized voltage values of 3.3 kV, 6.6 kV, 11 kV, etc. are not, and were never intended to be, multiples of 1.1. Any fortuitous relationship to the form factor of a sine wave (1.11) is purely coincidental.

These values have their basis in the formative years of the electrical supply industry, when the rounder figures of 3 kV, 6 kV, and 10 kV were adopted.

During that era, it was pessimistically assumed that the transmission lines would drop around 10% of the input voltage through power losses in the cable.

Therefore, to compensate for this, the primary generation voltage was set to the required nominal voltage plus the assumed transmission loss, e.g. 3000 V + 300 V = 3.3 kV. Hence the off-load generated voltages became 3, 6, and 10 kV + 10%, i.e. 3.3 kV, 6.6 kV, and 11 kV.
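The 10% rule of thumb described above reduces to one line of arithmetic; a quick sketch:

```python
# Off-load generated voltage = nominal voltage + 10% margin
# for the assumed line drop (the historical rule of thumb).
nominals = [3_000, 6_000, 10_000]   # early "round" nominal voltages, in volts

for v in nominals:
    generated = v * 1.10            # add 10% to cover the assumed line losses
    print(f"{v/1000:g} kV nominal -> {generated/1000:g} kV generated")
```

Running this reproduces the familiar 3.3 kV, 6.6 kV, and 11 kV ratings from the round 3, 6, and 10 kV figures.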
