ZK MapReduce Examples

Lagrange Labs' SDK is designed to enable developers to trustlessly compose cross-chain states into their DApps, using basic MapReduce interfaces. The SDK supports both in-browser proof generation for small-scale computations and distributed proof generation on our remote backend for larger-scale proofs.

Lagrange Labs’ storage proofs are highly efficient and aggregatable at scale, enabling large proof constructions that are otherwise untenable with competing solutions. For distributed remote proof generation, the computation is parallelizable, so proof generation time scales only as O(log n).
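To build intuition for the logarithmic scaling, consider pairwise aggregation: n leaves collapse in log2(n) levels, and every combine within a level is independent, so each level can run fully in parallel. The sketch below is plain JavaScript and purely illustrative (it is not the actual prover; `aggregate` and `combine` are made-up names standing in for proof aggregation):

```javascript
// Illustration only: pairwise aggregation of n leaves has depth log2(n),
// and all combines within one level are independent of each other.
function aggregate(values, combine) {
    let levels = 0;
    while (values.length > 1) {
        const next = [];
        for (let i = 0; i < values.length; i += 2) {
            // An odd element out is carried up to the next level unchanged.
            next.push(i + 1 < values.length ? combine(values[i], values[i + 1]) : values[i]);
        }
        values = next;
        levels += 1;
    }
    return { result: values[0], levels };
}

// 1024 leaf "proofs" collapse in log2(1024) = 10 sequential levels.
const leaves = Array.from({ length: 1024 }, (_, i) => i + 1);
const { result, levels } = aggregate(leaves, (a, b) => a + b);
// levels === 10; with enough workers, wall-clock time tracks levels, not leaves
```

With unbounded parallelism, total proving time tracks the tree depth (10 levels here), not the leaf count (1024), which is the O(log n) claim above.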

Below we’ll break down four examples of proof generation.

  1. A simple single storage slot proof done in browser.

  2. A distributed remote proof using a simple average over 1000 blocks.

  3. A distributed remote proof using multiple successive MapReduce steps to compute a moving average of price for a DEX while removing outliers.

  4. A distributed remote proof that merges multi-chain states into one dataframe.

A simple one-block account age proof:

A toy app for showing your friends how old your Ethereum wallet is.

var lagrangeJS = require('LagrangeJS');

const ethereum = lagrangeJS.configure({
    chain: "Ethereum",
    provider: ethereum_provider.getSigner()
})

//Block number of the wallet's first transaction
var oldest_block;

//Storage proof for a single block
var dataframe = ethereum.createDataFrame({
    block_range_start: oldest_block,
    block_range_end: oldest_block + 1,
    interval: 1,
    content: {
        address:"0xb794f5ea0ba39494ce839613fffba74279579268"
    }
})

//Generate ZKMR Proof
dataframe.generateProof({
    location:"local",
    max_threads:1
})

The average liquidity for a DEX over a certain interval:

Consider a situation where a DeFi app that relies on a liquidation mechanism (lending protocols, options protocols, derivatives protocols, etc.) wants to compute the average liquidity in the DEX that it liquidates into over a period of time.

In the example below, the average is taken over 1000 blocks (i.e. 1000 storage proofs).

var lagrangeJS = require('LagrangeJS');

const polygon = lagrangeJS.configure({
    chain: "polygon",
    provider: polygon_provider.getSigner()
})

//Storage proofs for 1000 consecutive blocks

var dataframe = polygon.createDataFrame({
    block_range_start:polygon.currentBlock() - 1000,
    block_range_end: polygon.currentBlock(),
    interval: 1,
    content: {
        address:"0xb794f5ea0ba39494ce839613fffba74279579268",
        keys:[
        {name: "liquidity_1", memory:"0x56e81...21"}
        ]
    }
})

//Compute Average Liquidity for the DEX (a reduce step)
var average_liquidity = dataframe.reduce(avg("liquidity_1"))

//Generate ZKMR Proof
average_liquidity.generateProof({
    location:"remote",
    max_threads:1000
})
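For intuition on the block range above, here is a plain-JavaScript sketch of how the range could be enumerated, assuming block_range_start is inclusive, block_range_end is exclusive, and interval means "every Nth block" (these are assumptions about the API, not confirmed by it; `sampleBlocks` is an illustrative helper, not part of LagrangeJS):

```javascript
// Hypothetical helper: enumerate the blocks a dataframe would cover,
// assuming a half-open [start, end) range sampled every `interval` blocks.
function sampleBlocks(start, end, interval) {
    const blocks = [];
    for (let b = start; b < end; b += interval) blocks.push(b);
    return blocks;
}

// With the parameters above (currentBlock() stubbed as 17,000,000):
const currentBlock = 17000000;
const blocks = sampleBlocks(currentBlock - 1000, currentBlock, 1);
// 1000 sampled blocks -> one storage proof per block
```

Under these assumptions, the range yields exactly 1000 blocks, matching the 1000 storage proofs (and max_threads:1000) in the example.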

A TWAP for a DEX that excludes outliers by using multiple MapReduce steps:

Consider a situation where a DeFi app wants to compute a time-weighted average price for a low-liquidity or volatile asset on a DEX. To avoid price manipulation, the DeFi app wants to filter out outliers that are +/- 2 standard deviations from the mean.

In the example below, the average price is computed by proving the liquidity of both assets in the pair at every block over a 1024-block window (i.e. 2048 storage proofs).

var lagrangeJS = require('LagrangeJS');

const polygon = lagrangeJS.configure({
    chain: "polygon",
    provider: polygon_provider.getSigner()
})

//Create a dataframe of storage proofs for 1024 consecutive blocks

var dataframe = polygon.createDataFrame({
    block_range_start:polygon.currentBlock() - 1024,
    block_range_end: polygon.currentBlock(),
    interval: 1,
    content: {
        block_hash:"0xc0f4...e6",
        address:"0xb794f5ea0ba39494ce839613fffba74279579268",
        keys:[
        {name: "liquidity_1", memory:"0x56e81...21"},
        {name: "liquidity_2", memory:"0x56211...32"}
        ]
    }
})

//Compute block-by-block price for the asset pair (a map step)
dataframe = dataframe.map("asset_price",
                           dataframe.liquidity_1.div(dataframe.liquidity_2))

//Compute Mean (a reduce step)
var price_mean = dataframe.reduce(avg("asset_price"))

//Compute Standard Deviation (a reduce step)
var price_std = dataframe.reduce(stddev("asset_price"))

//Keep only prices within 2 standard deviations of the mean (drops outliers)
dataframe = dataframe.filter(dataframe.asset_price.gt(price_mean - 2 * price_std) &&
                          dataframe.asset_price.lt(price_mean + 2 * price_std))

//Compute mean without outliers
var outlier_resistant_mean = dataframe.reduce(avg("asset_price"))


//Generate ZKMR Proof
outlier_resistant_mean.generateProof({
    location:"remote",
    max_threads:2048
})

//Submit proof to chain
lagrangeJS.submitProof({
    chain: "ethereum",
    provider: rpc_provider,
    publicStatement: outlier_resistant_mean.publicStatement,
    proof: outlier_resistant_mean.proof
})
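The map/reduce/filter pipeline above can be sanity-checked off-chain. The following plain-JavaScript replica applies the same two-standard-deviation cutoff to a made-up price array (invented sample data, not real DEX state), with 500 modeling a single-block manipulation spike:

```javascript
// Off-chain replica of the outlier-resistant mean pipeline above.
function outlierResistantMean(prices) {
    // Reduce steps: mean and (population) standard deviation
    const mean = prices.reduce((s, p) => s + p, 0) / prices.length;
    const variance = prices.reduce((s, p) => s + (p - mean) ** 2, 0) / prices.length;
    const std = Math.sqrt(variance);
    // Filter step: keep only prices within two standard deviations of the mean
    const inliers = prices.filter(p => Math.abs(p - mean) <= 2 * std);
    // Final reduce step: mean of the surviving prices
    return inliers.reduce((s, p) => s + p, 0) / inliers.length;
}

const cleanMean = outlierResistantMean([100, 101, 99, 100, 102, 98, 500]);
// the 500 spike is dropped; cleanMean === 100
```

Note how the naive mean of these prices (about 157) is dragged far from the true level by one manipulated block, while the filtered mean is not.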

Joining Multi-Chain Data Frames:

Consider a situation where a DeFi app wants to merge multi-chain states into one dataframe to compute an asset price that is averaged across two different DEXs on two different chains. Here, dataframe_Poly and dataframe_Opt are dataframes created with createDataFrame as in the previous examples, one per chain.

dataframe_Poly = dataframe_Poly.map("asset_price",
                           dataframe_Poly.liquidity_1.div(dataframe_Poly.liquidity_2))


dataframe_Opt = dataframe_Opt.map("asset_price",
                           dataframe_Opt.liquidity_1.div(dataframe_Opt.liquidity_2))

//Union Polygon and Optimism dataframes and compute mean price

var dataframe_Merged = dataframe_Poly.union(dataframe_Opt)

dataframe_Merged = dataframe_Merged.reduce(avg("asset_price"))

//Generate ZKMR Proof
dataframe_Merged.generateProof({
    location:"remote",
    max_threads:2048
})

//Submit proof to chain
lagrangeJS.submitProof({
    chain: "ethereum",
    provider: rpc_provider,
    publicStatement: dataframe_Merged.publicStatement,
    proof: dataframe_Merged.proof
})
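The union-and-average step can likewise be checked off-chain. This sketch uses invented liquidity values in place of proven storage slots, and plain arrays of objects in place of dataframes:

```javascript
// Invented sample rows standing in for proven storage slots on each chain
const polyRows = [{ liquidity_1: 300, liquidity_2: 3 }];
const optRows = [{ liquidity_1: 500, liquidity_2: 5 }];

// Map step: per-row price = liquidity_1 / liquidity_2
const withPrice = rows =>
    rows.map(r => ({ ...r, asset_price: r.liquidity_1 / r.liquidity_2 }));

// Union + reduce step: concatenate both frames, then average asset_price
const merged = [...withPrice(polyRows), ...withPrice(optRows)];
const avgPrice = merged.reduce((s, r) => s + r.asset_price, 0) / merged.length;
// avgPrice === 100 for these sample rows
```

Because both chains report a price of 100 in this sample, the cross-chain average is also 100; with diverging DEX prices, the union gives a single blended figure.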
