Dear All,

I have a question regarding convergence.

I am attempting to see the effect of removing a node from a power network using 
MATPOWER.

To do this, I first solve the IEEE 300-bus case, and then I set the branches of 
one of the nodes to zero. Doing this typically causes the power flow to no 
longer converge, although in some cases it still does. In an attempt to make 
the non-converging cases converge again, I systematically reduce the load by 
setting the bus Pd and Qd values 5% lower and then run the power flow again.
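For reference, here is a rough sketch of what I am doing (MATLAB with MATPOWER 
on the path); the bus number, the 5% step, and the iteration cap are just 
placeholders:

    % sketch: take out all branches at one bus, then shrink the load
    % in 5% steps until the power flow converges (or a cap is reached)
    define_constants;                        % PD, QD, F_BUS, T_BUS, BR_STATUS, ...
    mpc = loadcase('case300');               % IEEE 300-bus case

    bus_to_remove = 7;                       % placeholder bus number
    out = mpc.branch(:, F_BUS) == bus_to_remove | ...
          mpc.branch(:, T_BUS) == bus_to_remove;
    mpc.branch(out, BR_STATUS) = 0;          % take those branches out of service

    opt = mpoption('verbose', 0, 'out.all', 0);
    for k = 1:50                             % cap on load-reduction steps
        results = runpf(mpc, opt);
        if results.success
            break;
        end
        mpc.bus(:, [PD QD]) = 0.95 * mpc.bus(:, [PD QD]);   % 5% load reduction
    end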

Is there a better way to do this? Am I actually decreasing the load? And will 
decreasing the load make the equations more likely to converge?

Also, how can I tell what is preventing the power flow from converging? Is 
divergence only caused by a load imbalance where the load is too high?

Thanks in advance

Best Regards,
Mitchell T Dennis
