20th of February 2024 — Ryan Soury — [email protected]


For the past year, Usher Labs has been deeply invested in the challenge of achieving verifiable transparency, culminating in the development of the Transparency (T) Node & Network. This initiative advances beyond its publicly accessible MVP, the Log Store Network, and introduces a more sophisticated solution for ensuring transparency across blockchain applications.

Image 2: Depiction of the Transparency Node & Network architecture

The T Node & Network by Usher Labs establishes verifiable transparency through provenance and integrity proofs, anchored in the Elliptic Curve Digital Signature Algorithm (ECDSA) primitive. It also produces authenticity proofs, essential for shedding light on real-world and Web2 data, via the Multi-Party Computation Transport Layer Security (MPC-TLS) primitive. The importance of transparency is increasingly recognised within blockchain contexts, especially when integrating custom off-chain or real-world data.

Functioning within centralised systems, the T Node acts as a multifaceted entity: a database, an HTTP request prover, and a proxy interface. It incorporates a version of the open-source notarisation protocol (MPC-TLS) alongside a high-frequency, peer-to-peer publish/subscribe messaging protocol. The Node works with the Network to stream cryptographic attestations over diverse data points. The Network, in turn, serves dual roles: a data availability layer and a notary for generating authenticity proofs. This arrangement guarantees the secure, redundant, and fault-tolerant storage of cryptographic attestations. These attestations validate the provenance and integrity of both first- and third-party signals, metrics, and messages, as well as the authenticity of data derived from TLS-enabled HTTP endpoints. Such data includes sensitive financial information and Know Your Customer (KYC) details, along with other data accessible over the TLS-secured internet.
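To make the attestation flow concrete, here is a minimal sketch of wrapping a data point in an integrity digest plus a signature over that digest. The field names and key handling are illustrative assumptions, and an HMAC stands in for the ECDSA signature the T Node actually uses, purely to keep the example self-contained.

```python
import hashlib
import hmac
import json
import time

# Stand-in for an ECDSA private key; illustrative only.
SIGNING_KEY = b"demo-secret"


def attest(payload: dict, publisher: str) -> dict:
    """Wrap a data point in an attestation covering integrity and provenance."""
    body = {
        "payload": payload,
        "publisher": publisher,
        "timestamp": int(time.time()),
    }
    # Integrity: a digest over a canonical serialisation of the data point.
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    # Provenance: a signature over the digest (HMAC here; ECDSA in practice).
    signature = hmac.new(SIGNING_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return {**body, "digest": digest, "signature": signature}


def verify(att: dict) -> bool:
    """Recompute the digest and check the signature against it."""
    body = {k: att[k] for k in ("payload", "publisher", "timestamp")}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    expected = hmac.new(SIGNING_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return digest == att["digest"] and hmac.compare_digest(expected, att["signature"])
```

Any mutation of the payload after signing invalidates the digest, so downstream consumers can detect tampering without trusting the transport.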

Designed to facilitate access to on-chain capital for off-chain purposes, this infrastructure aims to function with the simplicity of a Web2 service. It enables the sharing of crucial metrics and attestations regarding off-chain data and processes with capital allocators to mitigate risk. Moreover, it incorporates proofs into transactions submitted to a blockchain for verification purposes.

While this technology is capable of authenticating data, it prompts a further discussion on the concept of "trusted data".

“Trusted Data” as a Heuristic

Real-world assets (RWAs) represent a convergence of contemporary technology and legal frameworks with blockchain systems. Even where a verifiable data pipeline can be established, the data supply chain almost always depends on some trusted source. Traditional oracle networks mitigate this by requiring consensus among multiple providers on a given data point. For niche, private, or custom datasets, however, the trustworthiness of the data remains a critical concern.

To navigate this, a series of logical assumptions can be formulated to assess the trustworthiness of data provided by a service entity. These assumptions are predicated on the following conditions:

  1. Primary Purpose and Data Provenance: Is the entity's main function to provide data, and is the data sourced directly (first-party)? Entities fitting this criterion include government-backed APIs or private companies specialising in industry-specific data aggregation and delivery.
  2. Value Correlation: Is the entity’s data validity and accuracy directly correlated with the value of their service? This condition differentiates entities prioritising data volume for broad insights from those offering high-quality data crucial for sensitive operations, such as those in financial markets.
  3. Capitalisation and Incentives: Is the entity sufficiently capitalised such that the financial incentive for maintaining data accuracy surpasses the temptation to falsify data? This consideration effectively excludes entities that serve as both trusted data providers and RWA issuers, where there is no mechanism to demonstrate non-collusion between these roles.

Although it is unlikely that many trusted data sources will meet all three conditions, this framework facilitates human-in-the-loop decision-making regarding the risk profile of a verifiable data pipeline and its designated "trusted" data sources. Within an RWA protocol, community-led governance could play a pivotal role in endorsing specific data sources as "trusted" providers, thereby reinforcing the integrity and reliability of the data underpinning the protocol.
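The three conditions above can be encoded as a simple checklist that governance participants evaluate per source. The field names, scoring, and risk labels below are illustrative assumptions, not part of any protocol specification.

```python
from dataclasses import dataclass


@dataclass
class DataSource:
    """A candidate 'trusted' data source, assessed against the three conditions."""

    name: str
    first_party_provider: bool  # 1. primary purpose is first-party data provision
    value_correlated: bool      # 2. data accuracy drives the service's value
    well_capitalised: bool      # 3. incentives to stay honest outweigh falsification

    def risk_profile(self) -> str:
        met = sum(
            [self.first_party_provider, self.value_correlated, self.well_capitalised]
        )
        return {3: "low risk", 2: "moderate risk", 1: "high risk", 0: "untrusted"}[met]
```

A governance process might, for example, require a "low risk" profile before endorsing a provider, or subject "moderate risk" sources to additional oracle-network corroboration.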

Applied Zero-Knowledge Technology

In the quest to bolster our transparency infrastructure, Usher Labs acknowledges the critical need to maintain privacy for certain data points during on-chain verification processes. This encompasses a wide range of sensitive information, including individual identities, private contractual details, and legal structures. To address this, Usher Labs is pioneering the development of a zkOracle Framework, an innovation built upon the T Node & Network foundation. This bespoke framework is designed to facilitate the creation of application-specific zkOracles. These oracles are capable of generating zero-knowledge proofs (zkProofs) that validate cryptographic assertions for data authenticity, provenance, and integrity, all while preserving the confidentiality of the underlying data.

Our strategy involves harnessing a proven ZK proof system, complemented by tooling such as zkLLVM or a zkVM.
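As an intuition for keeping data confidential during on-chain verification, the sketch below shows a hash-based commit-reveal scheme. This is not a zero-knowledge proof: it only binds to private data without publishing it, and the data is exposed if the commitment is opened. A zkProof goes further, letting a verifier check a statement about the data without it ever being revealed. The function names here are illustrative assumptions.

```python
import hashlib
import secrets


def commit(data: bytes) -> tuple[str, bytes]:
    """Bind to `data` without revealing it; the random nonce blinds
    low-entropy inputs against brute-force guessing."""
    nonce = secrets.token_bytes(32)
    commitment = hashlib.sha256(nonce + data).hexdigest()
    return commitment, nonce


def open_commitment(commitment: str, nonce: bytes, data: bytes) -> bool:
    """Check that revealed data matches the earlier commitment."""
    return hashlib.sha256(nonce + data).hexdigest() == commitment
```

In a zkOracle, the commitment could be posted on-chain while the proof system attests to properties of the committed data (for example, that a KYC check passed) without the reveal step ever occurring.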