Gradient Descent (VQE)
qudit and PennyLane use PyTorch autograd — one backward pass per step. The others use the parameter-shift rule — two circuit evaluations per parameter per step.
Notes:
- `qudit` (AD) — `torch.autograd` through `nn.Module`. One backward pass per step regardless of parameter count.
- `pennylane` (AD) — same gradient method, higher constant overhead from the QNode abstraction.
- `qiskit` (PS) — `StatevectorEstimator` batch API with parameter-shift. Batches all shift evaluations per step.
- `cirq` (PS), `braket` (PS), `qutip` (PS) — parameter-shift with sequential circuit rebuilds per evaluation. Cost is 2P× the forward cost per step for P parameters.
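The parameter-shift evaluations described above can be sketched with a toy example. This is a minimal illustration, not code from any of the benchmarked libraries: the cos-shaped `expectation` function stands in for a real circuit expectation value (for a single-qubit `RY(θ)` rotation, ⟨Z⟩ = cos θ, so the shift rule is exact here). Note the 2P circuit evaluations for P parameters, versus one backward pass for autograd.

```python
import numpy as np

def expectation(thetas):
    # Toy stand-in for a circuit expectation value: sum of <Z> after
    # independent RY(theta_i) rotations, i.e. sum_i cos(theta_i).
    return np.cos(thetas).sum()

def parameter_shift_grad(f, thetas, shift=np.pi / 2):
    # Parameter-shift rule: each partial derivative needs TWO circuit
    # evaluations, f(theta + s) and f(theta - s) -> 2P evaluations total.
    grad = np.empty_like(thetas)
    for i in range(len(thetas)):
        plus, minus = thetas.copy(), thetas.copy()
        plus[i] += shift
        minus[i] -= shift
        grad[i] = (f(plus) - f(minus)) / (2 * np.sin(shift))
    return grad

thetas = np.array([0.3, 1.1])
grad = parameter_shift_grad(expectation, thetas)
# For this cosine expectation the rule is exact: grad == -sin(thetas).
```

The sequential frameworks (`cirq`, `braket`, `qutip`) pay the additional cost of rebuilding the circuit for every one of those 2P evaluations, while `qiskit` submits them as a single batch.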