PQC in the middle of network, contd. #267
Thanks for your hint on how to use PQC in the middle of a network! My use-case is quite simple. In your example I don't see how the tunable_params should be handled.
When one uses

tunable_params = tf.Variable(..., trainable=True)
encoder_params = <output layer from my classical NN>
expectation_layer = tfq.layers.Expectation()
output = expectation_layer(
circuit,
symbol_names=symbols,
symbol_values=tf.concat([encoder_params, tunable_params], axis=1),
operators=ops)

When I concat the values from …

If things still aren't clear I would suggest reading into the examples using …
Thanks a lot for your clarification! While the principle of how it works is clear, I still struggle with the details. Following your suggestion I developed the following code:

import tensorflow as tf
import tensorflow_quantum as tfq
import cirq
from cirq.contrib.svg import SVGCircuit
import sympy
import numpy as np
# Create one-dimensional data for classification
np.random.seed(seed=123)
n = 900
data = np.random.rand(n, 1)
labels=[]
for p in range(0,n):
if data[p] <= 0.5:
label=[1,0]
else:
label=[0,1]
labels.append(label)
labels=np.array(labels)
bit = cirq.GridQubit(0, 0)
symbols = sympy.symbols('alpha, beta')
ops = [cirq.Z(bit)]
circuit = cirq.Circuit(
cirq.X(bit) ** symbols[0],
cirq.X(bit) ** symbols[1],
)
data_input = tf.keras.Input(shape=(1,),dtype=tf.dtypes.float32)
# Use a classical NN to transform the data
encod_1=tf.keras.layers.Dense(10, activation=tf.keras.activations.relu)(data_input)
encod_2=tf.keras.layers.Dense(1, activation=tf.keras.activations.softmax)(encod_1)
# One tunable parameter: beta in the network
tunable_params = tf.Variable([[np.pi/2]],shape=[1,1],trainable=True)
# Concatenating the input angle and the trainable parameter is not straightforward;
# see: https://stackoverflow.com/questions/36041171/tensorflow-concat-a-variable-sized-placeholder-with-a-vector
num_rows=tf.shape(encod_2)[0]
tunable_tiled = tf.keras.backend.tile(tunable_params, tf.keras.backend.stack([num_rows, 1]))
params=tf.concat([encod_2, tunable_tiled],axis=0)
expectation_layer = tfq.layers.Expectation()
expectation = expectation_layer(
circuit,
symbol_names=symbols,
symbol_values=params,
operators=ops)
classifier = tf.keras.layers.Dense(2, activation=tf.keras.activations.softmax)
classifier_output = classifier(expectation)
model = tf.keras.Model(inputs=data_input,
outputs=classifier_output)
tf.keras.utils.plot_model(model, show_shapes=True, dpi=70)
print(model.summary())

The model looks really weird, and the parameter of the quantum circuit (beta) is not trainable :-(
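One way to see why the model above behaves oddly is to check the shape bookkeeping of the concat step. The sketch below uses NumPy as a stand-in for the TF ops (the batch size and widths are made up for illustration): a batch of symbol values for the Expectation layer should have one row per sample and one column per circuit symbol, which requires concatenating along axis=1 rather than axis=0.

```python
import numpy as np

# Stand-ins for the tensors in the snippet above (hypothetical shapes:
# a batch of 4 samples, encoder output of width 1, one tunable parameter).
batch = 4
encod_2 = np.random.rand(batch, 1)          # classical encoder output
tunable_params = np.array([[np.pi / 2]])    # shape (1, 1)

# Tile the single tunable parameter to the batch size, as in the snippet.
tunable_tiled = np.tile(tunable_params, (batch, 1))   # shape (4, 1)

# axis=0 stacks rows: the batch dimension doubles and every row still
# carries only ONE symbol value -- not what the layer expects.
wrong = np.concatenate([encod_2, tunable_tiled], axis=0)
print(wrong.shape)   # (8, 1)

# axis=1 puts (alpha, beta) side by side: one row per sample,
# one column per circuit symbol.
right = np.concatenate([encod_2, tunable_tiled], axis=1)
print(right.shape)   # (4, 2)
```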
Ok, taking a quick glance at your code, there are several areas I would suggest investigating: …
Thanks a lot!
Since I come from the quantum computing side and not from TensorFlow development, it's not obvious to me what to do exactly so that the custom layer manages the tunable_params for me.

It would be a big advantage of TensorFlow Quantum (as compared to PennyLane, Qiskit, etc.) for the quantum machine learning community if it were possible to combine classical and quantum layers and train them simultaneously in the way I am aiming for, scaled to larger networks of course. At the moment this seems prohibitively difficult :-(
OK. I'm going to do my best to answer your questions and try to help things along. For starters, here is a working snippet made from the one you posted above:

import tensorflow as tf
import tensorflow_quantum as tfq
import cirq
import sympy
import numpy as np
#Create One-Dimensional-Data for Classification
np.random.seed(seed=123)
n = 900
data = np.random.rand(n, 1)
labels = []
for p in range(0, n):
if data[p] <= 0.5:
label = [1, 0]
else:
label = [0, 1]
labels.append(label)
labels = np.array(labels, dtype=np.int32)
bit = cirq.GridQubit(0, 0)
symbols = sympy.symbols('alpha, beta, gamma')
ops = [cirq.Z(bit)]
circuit = cirq.Circuit(
cirq.X(bit)**symbols[0],
cirq.X(bit)**symbols[1],
cirq.X(bit)**symbols[2]
)
class SplitBackpropQ(tf.keras.layers.Layer):
def __init__(self, upstream_symbols, managed_symbols, managed_init_vals,
operators):
"""Create a layer that splits backprop between several variables.
Args:
upstream_symbols: Python iterable of symbols to backprop
through this layer.
managed_symbols: Python iterable of symbols to backprop
into variables managed by this layer.
managed_init_vals: Python iterable of initial values
for managed_symbols.
operators: Python iterable of operators to use for expectation.
"""
super().__init__(SplitBackpropQ)
self.all_symbols = upstream_symbols + managed_symbols
self.upstream_symbols = upstream_symbols
self.managed_symbols = managed_symbols
self.managed_init = managed_init_vals
self.ops = operators
def build(self, input_shape):
self.managed_weights = self.add_weight(
shape=(1, len(self.managed_symbols)),
initializer=tf.constant_initializer(self.managed_init))
def call(self, inputs):
# inputs are: circuit tensor, upstream values
upstream_shape = tf.gather(tf.shape(inputs[0]), 0)
tiled_up_weights = tf.tile(self.managed_weights, [upstream_shape, 1])
joined_params = tf.concat([inputs[1], tiled_up_weights], 1)
return tfq.layers.Expectation()(inputs[0],
operators=ops,
symbol_names=self.all_symbols,
symbol_values=joined_params)
data_input = tf.keras.Input(shape=(1,), dtype=tf.dtypes.float32)
#Use a classical NN to transform the data
encod_1 = tf.keras.layers.Dense(
10, activation=tf.keras.activations.relu)(data_input)
encod_2 = tf.keras.layers.Dense(
1, activation=tf.keras.activations.softmax)(encod_1)
# This is needed because of Note here:
# https://www.tensorflow.org/quantum/api_docs/python/tfq/layers/Expectation
unused = tf.keras.Input(shape=(), dtype=tf.dtypes.string)
expectation = SplitBackpropQ(['alpha'], ['beta', 'gamma'], [np.pi / 2, 0],
ops)([unused, encod_2])
classifier = tf.keras.layers.Dense(2, activation=tf.keras.activations.softmax)
classifier_output = classifier(expectation)
model = tf.keras.Model(inputs=[unused, data_input], outputs=classifier_output)
model.compile(optimizer='Adam', loss='mse')
model.fit([tfq.convert_to_tensor([circuit for _ in range(n)]), data],
labels,
epochs=10)
# Now we can see 37 parameters, two of which belong to "SplitBackpropQ"
# since we told it above that we want it to manage ["beta", "gamma"]
model.summary()
print(model.trainable_variables)

Let's go over some key points. On line 30 of the snippet, you can see I've made my own little Keras layer. This layer was made by following the guide here: https://www.tensorflow.org/guide/keras/custom_layers_and_models . In the constructor the …

On line 80 I have followed the guidance of the large NOTE at the bottom of https://www.tensorflow.org/quantum/api_docs/python/tfq/layers/Expectation . The error you were getting was fixable and was caused by the quantum circuits you supplied not being traceable back to a …

I see from your error message that you might be using Anaconda. If you are, I would recommend switching off of it when working with TFQ. Anaconda uses a different source build of TensorFlow than the official pip release version. If you continue to run into errors with the above snippet (which I have tested works on my machine), it could very well be caused by the discrepancy in builds. To fix, just pip install tensorflow and pip install tensorflow-quantum.
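As a side check on the circuit in the snippet above (not part of the original answer): the three X-power gates commute and their exponents add, so ⟨Z⟩ on |0⟩ after X**alpha, X**beta, X**gamma is cos(π(α+β+γ)). A small NumPy sketch, assuming cirq's X**t convention (eigenvalues 1 and e^{iπt} up to global phase):

```python
import numpy as np

def x_pow(t):
    # Matrix of X**t: eigenvalue 1 on |+>, exp(i*pi*t) on |->
    # (matches cirq's XPowGate, including its global phase).
    p = np.exp(1j * np.pi * t)
    return np.array([[(1 + p) / 2, (1 - p) / 2],
                     [(1 - p) / 2, (1 + p) / 2]])

def z_expectation(alpha, beta, gamma):
    # <Z> on |0> after applying X**alpha, X**beta, X**gamma in sequence.
    psi = x_pow(gamma) @ x_pow(beta) @ x_pow(alpha) @ np.array([1.0, 0.0])
    return abs(psi[0]) ** 2 - abs(psi[1]) ** 2

a, b, c = 0.3, 0.5, -0.2
print(z_expectation(a, b, c))        # numerically equals...
print(np.cos(np.pi * (a + b + c)))   # ...cos(pi * (alpha + beta + gamma))
```

This closed form makes it easy to sanity-check what gradients with respect to beta and gamma should look like once the layer manages them.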
There has been lots of code written in TFQ that has either made its way into a publication or into examples in TFQ that demonstrate the kinds of hybrid modelling capabilities you are describing. For example, we have made a hybridized version of the QCNN (https://www.nature.com/articles/s41567-019-0648-8) as a tutorial: https://www.tensorflow.org/quantum/tutorials/qcnn . Another example would be https://arxiv.org/abs/1907.05415 ; as one of the lead authors on that paper, I can speak from experience: these experiments took only several hours to run in a carefully constructed TFQ environment, where they would otherwise have taken us days using something like Qiskit or PennyLane, whose scaling we found was not nearly as favorable as TFQ's. Does this clear things up?
Dear Michael, …
Dear all! Does anybody have an idea what has changed under the hood of tensorflow-quantum, and how to change SplitBackpropQ to make it run again? Many thanks in advance!

TypeError Traceback (most recent call last)
in __init__(self, upstream_symbols, managed_symbols, managed_init_vals, operators)
~/miniconda3/lib/python3.8/site-packages/tensorflow/python/training/tracking/base.py in _method_wrapper(self, *args, **kwargs)
~/miniconda3/lib/python3.8/site-packages/keras/engine/base_layer.py in __init__(self, trainable, name, dtype, dynamic, **kwargs)
TypeError: Expected …
It's an error in the class initialization. I don't know what changed in TF (or Python), but if you change …
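The failure mode can be reproduced without TF at all: newer Keras validates the first positional argument of Layer.__init__ (trainable) as a boolean, so passing the class object itself, as SplitBackpropQ's super().__init__(SplitBackpropQ) does, raises a TypeError. Below is a plain-Python sketch of that pattern; MiniLayer is a made-up stand-in for tf.keras.layers.Layer, and the guess at the fix is simply calling super().__init__() with no arguments:

```python
# Made-up stand-in for tf.keras.layers.Layer that, like newer Keras
# versions, type-checks its first positional argument `trainable`.
class MiniLayer:
    def __init__(self, trainable=True, name=None):
        if not isinstance(trainable, bool):
            raise TypeError(
                "Expected `trainable` argument to be a boolean, "
                f"but got: {trainable!r}")
        self.trainable = trainable
        self.name = name

# The original pattern: the class object lands in `trainable`.
class Broken(MiniLayer):
    def __init__(self):
        super().__init__(Broken)   # raises TypeError on newer base classes

# The fix: let the base class use its defaults.
class Fixed(MiniLayer):
    def __init__(self):
        super().__init__()

try:
    Broken()
except TypeError as e:
    print("raises:", e)

print(Fixed().trainable)   # True
```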
Thanks a lot!!! That works indeed. The SplitBackpropQ class was provided by Michael and it worked all the time, until my recent upgrade. Anyway, issue #672 is indeed related, but I don't think this simple solution works in my case...
Glad to hear you made some progress! I think I get what you're after now so I'll take a stab at it:
If the goal is to have some circuit that encodes data into certain circuit symbols (potentially from a downstream model) that also incorporates learnable symbols that should be trained as the overall model is trained (symbols that are different from the encode symbols) you can do something like this:
Note that I've switched to the Expectation layer for more explicit control over what I'm doing with the symbols. With this snippet I'm free to make encoder_params come from an input (like you have in your snippet) and also keep tunable_params as some learnable parameter. Was something like this more like what you were after?
Originally posted by @MichaelBroughton in #249 (comment)