Zero-Knowledge private machine learning on Bitcoin


This post was first published on Medium.

Previously, we demonstrated running a fully fledged deep neural network on Bitcoin, where both the input and the model of the machine learning (ML) algorithm are public. In practice, it is often desirable to keep the input or the model off chain, and thus private, while still ensuring the ML algorithm is run faithfully. We achieve this by applying Zero-Knowledge Proof (ZKP) to ML.

Zero-knowledge on-chain machine learning

There are two categories of private information when it comes to ML.

Private input

The input to the model is hidden, but the model itself is public. This is particularly useful for applications involving sensitive and private data such as financial records, biometric data (e.g., fingerprints, faces), medical records, and location information. For example, one can prove he is over 21 years old without disclosing his age. Or an insurance company uses a credit score model for loan approval. The model is made public for transparency, but the inputs, such as an applicant’s salary and bank statements, should be kept confidential.

Private model

The input to the model is public, but the model itself is private, often because it is intellectual property. For instance, we use a tumor classification model owned by a private company to detect tumors in images. The model is certified to have 99% accuracy when classifying a public dataset. The company can simply publish a cryptographic commitment to its model, i.e., the hash of all model parameters. We can be sure the model is legitimate without ever seeing it. The cryptographic commitment also ensures the same model is applied to everyone, for fairness. This is desirable in, e.g., an admission model that ranks candidates based on their public information.

ZKP is a natural fit for preserving privacy in on-chain ML, because it can keep information hidden off chain while still proving that the ML inference was carried out correctly.

Classifying Handwritten Digits

As a demonstration, we have implemented a simple model for the classification of handwritten digits. The model was trained using labeled examples from the MNIST dataset. The architecture of the model is very similar to the one we used for our fully on-chain model.

ZK Circuits diagram

We use ZoKrates to build the ZK circuits, which can make any input private trivially, simply by declaring it with the keyword private.
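As a toy illustration (not taken from the project’s repository, and written in ZoKrates 0.7-style syntax), the following circuit proves the age check from the earlier example: the witness age is marked private, so only the public threshold and the outcome are revealed. The parameter names are our own.

// Illustrative only: prove a private value satisfies a public claim,
// e.g., "my age is at least the threshold", without revealing the value.
def main(private field age, field threshold) -> bool:
    return age >= threshold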

Private input

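Since the original listing is not reproduced here, below is a minimal sketch of the idea in ZoKrates 0.7-style syntax: a single dense layer over a toy 4-value input with 3 output classes. The parameter names, the dimensions, and the quantization details are illustrative assumptions; the full MNIST circuit lives in the GitHub repository referenced at the end of the post.

// Sketch only: a single dense layer classifier with a private input.
// model_inputs is hidden from the verifier; weights and biases are public.
def main(private field[4] model_inputs, field[3][4] weights, field[3] biases) -> u32:
    u32 best = 0
    field best_score = 0
    for u32 c in 0..3 do
        // class score: dot(weights[c], model_inputs) + biases[c]
        field score = biases[c]
        for u32 i in 0..4 do
            score = score + weights[c][i] * model_inputs[i]
        endfor
        // running argmax (assumes non-negative, quantized fixed-point values)
        bool better = (c == 0) || (score > best_score)
        best = if better then c else best fi
        best_score = if better then score else best_score fi
    endfor
    return best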
From the above code, we can see that the model’s input, model_inputs, is passed as a private parameter, while the model parameters (weights and biases) are public. Once we pass the input to the model, the circuit performs all of the model’s operations on the data and outputs the model’s prediction, i.e., the predicted class.

Private model

The following is the code for making the model private.

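As before, this is only a minimal sketch in ZoKrates 0.7-style syntax rather than the actual listing: the weights and biases are now private, a small labeled test batch is public, and the circuit asserts a minimum number of correct classifications. The batch size of 8 and the threshold of 7 correct predictions are arbitrary placeholders.

// Sketch only: the model parameters are private; the test batch is public.
// The proof succeeds only if the private model meets the accuracy threshold.
def main(field[8][4] test_inputs, u32[8] test_labels, private field[3][4] weights, private field[3] biases) -> bool:
    u32 correct = 0
    for u32 n in 0..8 do
        // same dense layer + argmax as in the private-input circuit
        u32 best = 0
        field best_score = 0
        for u32 c in 0..3 do
            field score = biases[c]
            for u32 i in 0..4 do
                score = score + weights[c][i] * test_inputs[n][i]
            endfor
            bool better = (c == 0) || (score > best_score)
            best = if better then c else best fi
            best_score = if better then score else best_score fi
        endfor
        correct = if best == test_labels[n] then correct + 1 else correct fi
    endfor
    // classification accuracy threshold: at least 7 of the 8 examples correct
    assert(correct >= 7)
    return true

In the full setting, the circuit would also hash the private parameters and assert that the digest matches the company’s published commitment; that check is omitted from this sketch.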
Here, instead of passing the model’s input data as private, we pass the model’s parameters themselves as private. Using these secret parameters, the circuit performs all the necessary operations of the model and compares the results against a batch of test examples. If the model reaches a certain classification accuracy (CA) threshold, the execution succeeds.

The full code of both the first scenario and the second scenario can be found on GitHub.

Summary

We have demonstrated how we can leverage the zero-knowledge property of zk-SNARKs for machine learning on chain. This allows us to hide specific parts of the ML computation.
