Forge Transactions

As we discussed earlier, transactions are the smallest units of activity that happen on Forge-backed chains, and the code that backs a transaction is called a transaction protocol. A transaction protocol is to a Forge transaction what a smart contract is to an Ethereum transaction.

By default, Forge ships with a set of core transaction protocols, each covering a set of typical use cases. Application developers can install all of these protocols or pick only the ones they want to support. For example, an application may decide that no ordinary asset creation is allowed on its chain, and therefore install the core protocols without the create_asset / update_asset protocols. Applications can also build and install their own protocols if the existing ones do not fit their needs.

Categories

The core transaction protocols are grouped into the following categories.

Base

These are the most basic transaction protocols, which every chain must install. Right now this category contains a single transaction protocol: core, which provides all the basic functions for state creation and update. Note that this transaction protocol is subject to change.

Account

Account-related transaction protocols, including:

Asset

Asset-related transaction protocols, including:

Basic asset creation and manipulation:

Advanced asset creation and exchange:

Trade

Governance

Documentation will be available soon.

Stake

Documentation will be available soon.

How to write a transaction protocol

To write a new transaction protocol, you need to prepare the following files (an example layout is shown after this list):

  • config.yml or config.json: configuration for this transaction protocol. Used by forge-compiler to know which files to compile and what metadata to include in the protocol.
  • a proto file (optional): the protobuf definition for your transaction protocol. Must be referenced in config.yml.
  • a pipeline file (optional): the transaction pipeline for your transaction protocol. Must be referenced in config.yml.
  • a set of source files: the logic of the transaction protocol.
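
For example, a possible layout for the consume_asset protocol described in the next section (the directory name and comments are illustrative, not required by forge-compiler) is:

consume_asset/
├── config.yml      # protocol metadata, read by forge-compiler
├── protocol.proto  # protobuf definition (optional)
├── protocol.yml    # transaction pipeline (optional)
└── protocol.ex     # protocol logic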

Once you write a new tx protocol, you can either compile it directly with forge-compiler:

$ forge-compiler config.yml

or add it to your Makefile so that make build-protocols will take care of it for you.
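
A minimal sketch of such a Makefile target, assuming the protocol sources live in a consume_asset/ directory (the path is an assumption made for illustration):

build-protocols:
	forge-compiler consume_asset/config.yml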

The compiled result is a URL-safe Base64 encoded (without padding) deploy_protocol itx. For your local node, you can use a moderator's wallet to send it to the chain.

config.yml

An example of config.yml looks like this:

---
name: consume_asset
version: 0
namespace: CoreTx
description: Consume an asset that is owned by self
type_urls:
  fg:t:consume_asset: ForgeAbi.ConsumeAssetTx
proto: protocol.proto
pipeline: protocol.yml
sources:
- protocol.ex

type_urls is a map of type urls (the key is the type_url and the value is the module name) that will be registered with ForgeAbi. The type urls listed here are used by ForgeAbi.encode_any / ForgeAbi.decode_any.
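
As a rough sketch of what that registration enables (the exact encode_any / decode_any signatures and the ConsumeAssetTx fields are assumptions here, and the asset address is a placeholder), once the protocol above is installed you could wrap and unwrap its itx like this:

# Minimal sketch, assuming ForgeAbi.encode_any/2 takes a message plus its
# registered type_url and ForgeAbi.decode_any/1 reverses it.
itx = ForgeAbi.ConsumeAssetTx.new(address: "placeholder_asset_address")
any = ForgeAbi.encode_any(itx, "fg:t:consume_asset")  # a protobuf Any carrying the itx
decoded = ForgeAbi.decode_any(any)                    # back to the ConsumeAssetTx struct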

version must be monotonically increasing. Forge will refuse to install a tx protocol with an old version.
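
For example, if the consume_asset protocol above was first deployed with version: 0, its next release would need a larger value in config.yml:

version: 1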

protocol source code

Normally you just need a single file if your protocol is not too complex. The file structure looks like this:

defmodule CoreTx.ProtocolName do
  # RPC helper function for sdk to use
  defmodule Rpc do
    import ForgeSdk.Tx.Builder, only: [tx: 1]
    tx :protocol_name
  end


  # customized pipe for Check pipeline
  defmodule CheckTx do
    use ForgePipe.Builder
    def init(opts), do: opts

    def call(info, _opts) do
      info
    end
  end

  # customized pipe for Verify pipeline
  defmodule VerifyTx do
    use ForgePipe.Builder
    def init(opts), do: opts

    def call(info, _opts) do
      info
    end
  end

  # customized pipe for Update pipeline
  defmodule UpdateTx do
    use ForgePipe.Builder

    def init(opts), do: opts

    def call(info, _opts) do
      info
    end
  end
end

If your logic is too complex to fit into one file, you can use multiple source files; just remember to reference them in config.yml.
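
For instance, the sources section of config.yml would then list every file (helpers.ex is a hypothetical file name used for illustration):

sources:
- protocol.ex
- helpers.ex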

Guide on protocol upgrade

The protobuf message model ensures that, as long as we follow certain rules, a message can be upgraded with backward-compatible changes. These rules are (see the example after this list):

  • Existing fields should not be renumbered. If you have already used string name = 1, you cannot change it to string name = 2 later on.
  • Existing fields should not have their types changed. You can't change BigSint value = 2 to BigUint value = 2. Exceptions:
    1. a regular field can be upgraded to a oneof
    2. a regular field can be upgraded to a repeated field.
  • New fields should not reuse any previously assigned field number.
  • Enum defaults should be picked such that they make sense looking forward, or be set to UNSPECIFIED.
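
As an illustration of these rules (the message and field names below are made up, not part of any core protocol), a backward-compatible upgrade keeps existing field numbers and types and only adds new fields under previously unused numbers:

// version 0
message SampleTx {
  string name = 1;
  BigUint value = 2;
}

// version 1: still backward-compatible -- existing fields keep their numbers
// and types, and the new field uses a number that was never assigned before.
message SampleTx {
  string name = 1;
  BigUint value = 2;
  repeated string tags = 3;
}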