I continued with the next exercise on the QML track. The task was to build a VQC (Variational Quantum Classifier) that classifies Ising-spin systems into ordered and disordered states with an accuracy of at least 90%.
The Ising model is a simplified model of a ferromagnet, describing its magnetic properties. A ferromagnet displays spontaneous magnetization (i.e. parallel alignment of its atoms' spins) below its critical temperature. In our dataset, the ferromagnet is represented by five spins arranged in a linear chain.
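To make the ordered/disordered distinction concrete, here is a minimal sketch (plain NumPy, not part of the original exercise) that computes the energy of a five-spin chain with open boundaries and coupling J = 1; the fully aligned configuration minimizes the energy, while an alternating one maximizes it:

```python
import numpy as np

def chain_energy(spins, J=1.0):
    """Energy of a 1D Ising chain with open boundaries: E = -J * sum_i s_i * s_{i+1}."""
    spins = np.asarray(spins)
    return -J * np.sum(spins[:-1] * spins[1:])

ordered = [1, 1, 1, 1, 1]       # all spins aligned -> minimal energy
disordered = [1, -1, 1, -1, 1]  # alternating spins -> maximal energy for J > 0

print(chain_energy(ordered))     # -4.0
print(chain_energy(disordered))  #  4.0
```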
Variational Quantum Classifier
A Variational Quantum Classifier differs from a conventional neural-network classifier in two essential points. First, the input data needs to be encoded into qubits. Second, the hypothesis function is a parameterized quantum circuit, whose parameters are subsequently optimized using classical optimizers.
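Basis encoding, which the circuit below uses via qml.BasisState, expects bit strings. If the spin configurations are stored as ±1 values (an assumption on my part, not a property of the dataset), they first need to be mapped to {0, 1}; a minimal sketch:

```python
import numpy as np

def spins_to_bits(spins):
    """Map Ising spins {-1, +1} to basis-state bits: spin up (+1) -> 0, spin down (-1) -> 1."""
    spins = np.asarray(spins)
    return ((1 - spins) // 2).astype(int)

print(spins_to_bits([1, -1, 1, -1, 1]))  # [0 1 0 1 0]
```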
Implementation
My implementation draws heavily on the PennyLane VQC tutorial, though I made a few adaptations.
First we need to create our hypothesis. Similarly to the tutorial, I chose strongly entangling layers. I only want to distinguish two classes, and I do not want to create a quantum-classical hybrid pipeline (apart from the classically added bias term). Therefore, I only need the expectation value of a single qubit; it does not matter which one I choose.
import pennylane as qml
from pennylane import numpy as np

num_wires = 5  # five spins in the chain
dev = qml.device("default.qubit", wires=num_wires)

@qml.qnode(dev)
def circuit(weights, x):
    """VQC architecture: encodes the input data and applies a series of
    strongly entangling layers to perform the classification.

    Args:
        weights (np.ndarray): weights for the entangling layers
        x (np.ndarray): spin configuration as a bit string

    Returns:
        float: expectation value of one qubit after the entangling layers
    """
    qml.BasisState(x, wires=range(num_wires))
    qml.StronglyEntanglingLayers(weights=weights, wires=range(num_wires))
    return qml.expval(qml.PauliZ(0))

def variational_classifier(weights, bias, x):
    return circuit(weights, x) + bias
Next, I initialize the parameters and the optimizer. I opted for three layers and a smaller step size than in the tutorial.
from pennylane.optimize import NesterovMomentumOptimizer

np.random.seed(0)
n_layers = 3
shape = qml.StronglyEntanglingLayers.shape(n_layers=n_layers, n_wires=num_wires)
weights = np.random.random(size=shape)
bias = np.array(0.0, requires_grad=True)

opt = NesterovMomentumOptimizer(0.1)
batch_size = 10
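The training loop below calls cost and accuracy helpers that I have not shown; they follow the tutorial. A minimal sketch, assuming a square loss on the ±1 labels (variational_classifier refers to the model defined above):

```python
import numpy as np

def square_loss(labels, predictions):
    # Mean squared distance between the +/-1 labels and the raw model outputs.
    return np.mean((np.asarray(labels) - np.asarray(predictions)) ** 2)

def accuracy(labels, predictions):
    # Fraction of predictions whose sign matches the label.
    return np.mean(np.sign(labels) == np.sign(predictions))

def cost(weights, bias, X, Y):
    # Square loss over a batch of spin configurations.
    predictions = [variational_classifier(weights, bias, x) for x in X]
    return square_loss(Y, predictions)
```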
Finally, the optimization loop …
for it in range(25):
    # Update the weights by one optimizer step
    batch_index = np.random.randint(0, len(ising_configs), (batch_size,))
    X_batch = ising_configs[batch_index]
    Y_batch = labels[batch_index]
    weights, bias, _, _ = opt.step(cost, weights, bias, X_batch, Y_batch)

    # Compute accuracy
    predictions = [np.sign(variational_classifier(weights, bias, x)) for x in ising_configs]
    acc = accuracy(labels, predictions)

    print(
        "Iter: {:5d} | Cost: {:0.7f} | Accuracy: {:0.7f}".format(
            it + 1, cost(weights, bias, ising_configs, labels), acc
        )
    )
I reach an accuracy of 93.6%, which comfortably clears the 90% target for this toy problem.