LQR controller tuning problem in a closed-loop system

Junhwi on 18 Mar 2023
Commented: Sam Chak on 20 Mar 2023
Hi, I wrote some code for an LQR controller in a closed loop to compare it with the open-loop system in discrete time.
But the result is not what I expected: the state trajectories of the closed-loop system are worse than those of the open-loop system, and one of the closed-loop states does not converge fast enough.
Please check my code. I think it is a tuning problem.
% LQR DC motor control
% State space model of a DC motor
Ra= 1; %Armature electric resistance[Ohm]
La= 0.5; %Electric inductance[H]
Ki= 0.023; %Motor torque constant= Electromotive force constant[Nm/A]
Jm= 0.01; %Moment of inertia of the rotor[kgm^2]
Kb= 0.023;
Bm= 0.00003;
A=[-Ra/La -Kb/La; Ki/Jm -Bm/Jm ];
B=[1/La; 0];
C=[0 1];
D=0;
Ts= 0.2; %Sampling time
Np= 10; %Prediction horizon
[n,d]=ss2tf(A,B,C,D); %Numerator and denominator of the transfer function
Ct = tf(n,d); %Continuous-time transfer function
Cs = ss(tf(n,d)); %Continuous-time state-space model
Dt = c2d(tf(n,d), Ts,'zoh'); %Discretize tf with sampling time Ts
Ds= c2d(Cs, Ts); %Discretize ss
%% define the desired state
xd=[0; 0];
% compute ud(desired control input)
ud=-inv((Ds.B)'*(Ds.B))*(Ds.B)'*(Ds.A)*xd;
%% simulate the system response for the computed ud
% set initial condition
x0=1*randn(2,1);
%final simulation time
tFinal=100;
time_total=0:Ts:tFinal;
input_ud=ud*ones(size(time_total));
[output_ud_only,time_ud_only,state_ud_only] = lsim(Ds,input_ud,time_total,x0);
% figure(1)
% plot(time_total,state_ud_only(:,1),'k')
% hold on
% plot(time_total,state_ud_only(:,2),'r')
% xlabel('Time [s]')
% ylabel('states')
% legend('i_{a}','\omega_{m}')
% grid
%% Design LQR controller
Q = diag([0,300]); % State weighting matrix
%Q = Ds.C'*Ds.C; % Bryson's rule as a starting point for Q
R = 0.01; % Input weighting (scalar, single input)
% Compute LQR gain matrix K
[K,S,e] = dlqr(Ds.A, Ds.B, Q, R); % no cross-weighting term
% eigenvalues of the closed-loop system: asymptotically stable since they lie inside the unit circle
[Vcl, Dcl] = eig(Ds.A-Ds.B*K);
% eigenvalues of the open-loop system (also stable)
eig(Ds.A);
%simulate the closed loop system
%closed loop matrix
Acl= Ds.A-Ds.B*K;
lqrClosedLoop=ss(Acl, Ds.B, Ds.C, Ds.D, Ts); % keep the closed-loop model in discrete time (sampling time Ts)
closed_loop_input=0.2*ones(size(time_total));
%
%figure(2)
%step(lqrClosedLoop)
[output_closed_loop,time_closed_loop,state_closed_loop] = lsim(lqrClosedLoop,closed_loop_input,time_total,x0);
figure(3)
plot(time_total,state_ud_only(:,1),'k')
hold on
plot(time_total,state_ud_only(:,2),'r')
plot(time_closed_loop,state_closed_loop(:,1),'m')
plot(time_closed_loop,state_closed_loop(:,2),'b')
xlabel('Time [s]')
ylabel('states')
legend('i_{a}','\omega_{m}','i_{a}-Closed Loop','\omega_{m}-Closed Loop')
grid
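For comparison, a minimal sketch of the pure regulation response I would expect from this gain (assuming the same Ds, K, time_total, and x0 as above, with no external input), just iterating the closed-loop difference equation:
% Sketch: simulate x(k+1) = (Ds.A - Ds.B*K)*x(k) from the same initial condition
Nsim = numel(time_total);
x_reg = zeros(2, Nsim);
x_reg(:,1) = x0;
for k = 1:Nsim-1
    u_reg = -K*x_reg(:,k);                        % state-feedback law u = -K*x
    x_reg(:,k+1) = Ds.A*x_reg(:,k) + Ds.B*u_reg;  % discrete plant update
end
figure(4)
plot(time_total, x_reg(1,:), 'm', time_total, x_reg(2,:), 'b')
xlabel('Time [s]'), ylabel('states'), grid on
legend('i_{a} - regulator', '\omega_{m} - regulator')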

Accepted Answer

Sam Chak on 19 Mar 2023
The LQR in this example is designed in continuous-time. Can you obtain the discretized control system?
% LQR DC motor control
% State space model of a DC motor
Ra = 1; % Armature electric resistance[Ohm]
La = 0.5; % Electric inductance[H]
Ki = 0.023; % Motor torque constant= Electromotive force constant[Nm/A]
Jm = 0.01; % Moment of inertia of the rotor[kgm^2]
Kb = 0.023;
Bm = 0.00003;
A = [-Ra/La -Kb/La; Ki/Jm -Bm/Jm ];
B = [1/La; 0];
C = [0 1]; % Output is state 2 (so tune q2 in Q)
D = 0;
Ts = 1; % Desired settling time [s]
q2 = 1e2; % Tune this value only
Q = diag([1, q2]);
R = 1;
K = lqr(A, B, Q, R)
K = 1×2
3.9932 9.9705
sys = ss(A-B*K, B, C, D);
N = 1/dcgain(sys); % precompensation gain so the closed loop has unity DC gain
Gcl = ss(A-B*K, N*B, C, D) % closed-loop system
Gcl =
  A =
            x1       x2
   x1   -9.986   -19.99
   x2      2.3   -0.003
  B =
        u1
   x1   20
   x2    0
  C =
       x1  x2
   y1    0   1
  D =
       u1
   y1    0
Continuous-time state-space model.
tau = Ts*3*2; % period of the test square wave, several times the settling-time target
[u, t] = gensig('square', tau, 2*tau, 0.01);
lsim(Gcl, u, t), ylim([-1 2]), grid on
stepinfo(Gcl)
ans = struct with fields:
         RiseTime: 0.3306
    TransientTime: 0.8615
     SettlingTime: 0.8615
      SettlingMin: 0.9003
      SettlingMax: 1.0327
        Overshoot: 3.2714
       Undershoot: 0
             Peak: 1.0327
         PeakTime: 0.6823
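A minimal sketch of how the same design could be carried over to discrete time, assuming the 0.2 s sampling period from the question (Tsd, Gd, Kd, and Nd are just placeholder names, and the weights may need retuning at this sampling rate):
% Sketch: discrete-time counterpart of the design above
Tsd  = 0.2;                              % sampling time taken from the question [s]
Gd   = c2d(ss(A, B, C, D), Tsd, 'zoh');  % discretized plant
Kd   = dlqr(Gd.A, Gd.B, Q, R);           % discrete LQR gain
Gcld = ss(Gd.A - Gd.B*Kd, Gd.B, Gd.C, Gd.D, Tsd);
Nd   = 1/dcgain(Gcld);                   % same precompensation idea as in the continuous design
Gcld = ss(Gd.A - Gd.B*Kd, Nd*Gd.B, Gd.C, Gd.D, Tsd);
step(Gcld), grid on
stepinfo(Gcld)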
4 Comments
Junhwi on 20 Mar 2023
Thank you very much!
Sam Chak on 20 Mar 2023
Glad to hear that, @Junhwi. If you find the solution is helpful, please consider accepting ✔ and voting 👍 the Answer. Thanks a bunch! 🙏
