The Graph: A Project With Some Promise (Part One)



The Graph, which advertises itself as a provider of "APIs for a vibrant decentralized future," stands out on its face as a plausible blockchain project.

However, the only way to find out for sure is by examining the project's core features and plans.

Basic Premise of the Project

One thing that sticks out as interesting about this project is its use of GraphQL.

Specifically, 'The Graph' labels itself "A Global GraphQL API" on its website, elaborating:

"Subgraphs can be composed into a global graph of all the world's public information. This data can be transformed, organized and shared across applications for anyone to query with just a few keystrokes."

The idea that this project will process the entire planet's store of public information is a bit grandiose, but there is undoubtedly a use for a blockchain-based API...if it is implemented correctly.

More on GraphQL

GraphQL itself is a query language used to craft structured queries that are sent to a server's API endpoint; the server returns exactly the requested data, typically in JSON format.

[1]

The GraphQL website defines it as:

"A query language for APIs and a runtime for fulfilling those queries with your existing data. GraphQL provides a complete and understandable description of the data in your API, gives clients the power to ask for exactly what they need and nothing more, makes it easier to evolve APIs over time, and enables powerful developer tools."

The main benefits of GraphQL in a nutshell are that it is:

  1. Widely deployed
  2. Widely available
  3. Easy to use (for anyone accustomed to working with APIs for a variety of reasons)
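To make the "ask for exactly what you need" point concrete, here is a minimal sketch of a GraphQL query and the JSON body a client would typically POST to an endpoint. The `token` entity and its fields are hypothetical, invented purely for illustration:

```python
import json

# A GraphQL query asks for exactly the fields the client needs --
# here, the name and symbol of a hypothetical `token` entity.
query = """
{
  token(id: "0x123") {
    name
    symbol
  }
}
"""

# Over HTTP, the query itself is typically wrapped in a JSON body:
payload = json.dumps({"query": query})
print(payload)
```

The server parses the `query` field, resolves just those fields, and responds with a matching JSON object — nothing more.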

Purpose of This Tool

Generally, this tool is used to extract information from an API endpoint so that it can be stored, or piped into an application designed to create graphical representations of the data, like Kibana, Redash, or Metabase.

Dune Analytics

If you're looking for a live example of how API data like this can be used, then look no further than 'Dune Analytics' (this may actually be how a project like 'The Graph' would be leveraged).

Below is a quick look at the site to give readers a better idea of how 'The Graph' could work in theory (we'll have to evaluate the technical underpinnings of the project to ensure that they've implemented this in the way that they propose):

[2]

Redash, specifically, builds these graphs via SQL queries.

Here's a quick breakdown of how this works in layman's terms:

  1. See that graph that we showed above? That graph is created through a program called Redash.
  2. In order for graphs like that to be created, the program needs to pull the data from somewhere. Redash (and other programs like it), will pull that data from a database backend.
  3. Database backends are typically used because many databases essentially store data in a manner similar to a CSV / Excel spreadsheet.
  4. "Querying data" from a database is a fancy way of asking the database for certain data.

    Imagine you have the following table (the original figure is missing; the second row is illustrative):

      Name  | column2
      ------+----------
      Sally | Freshman
      John  | Sophomore

A typical database query would look something like:

SELECT column2 FROM table1 WHERE Name='Sally';

Which would extract the following information:

      Freshman

In other words, the example SQL query we created yields a response of "Freshman". If we translate this to a query of blockchain data, we can craft a (more complex) query that tracks the total traded volume of Uniswap between two predefined dates. Redash (the program that pulls data for the graphs using SQL queries) provides a query-editor screen for composing queries like these.
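The example query above can be run end to end using Python's built-in sqlite3 module. The table and column names mirror the toy example; the second row is added purely for illustration:

```python
import sqlite3

# In-memory database standing in for the backend Redash would query.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# table1 mirrors the toy example: column2 holds the class year.
cur.execute("CREATE TABLE table1 (Name TEXT, column2 TEXT)")
cur.executemany(
    "INSERT INTO table1 VALUES (?, ?)",
    [("Sally", "Freshman"), ("John", "Sophomore")],  # second row illustrative
)

# The same query from the article:
cur.execute("SELECT column2 FROM table1 WHERE Name='Sally'")
result = cur.fetchone()[0]
print(result)  # → Freshman
```

A real deployment would point at a persistent database populated from blockchain data, but the querying mechanics are identical.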

Integrating GraphQL into the Workflow

Now let's get to GraphQL. If you're wondering how the data that we pulled from the database backend got there in the first place, look no further than a reliable API tool like GraphQL! (and possibly a pubsub - but we'll get to that later)

Below is a picture of how GraphQL may be used to funnel information into a database backend:

[Diagram: GraphQL live-query architecture, from hasura/graphql-engine's live-queries.md]

[3]

There are various additional tools (like Hasura.io) that have been created to facilitate querying SQL backends directly through GraphQL queries (the tool translates GraphQL into SQL, with query multiplexing used for load-balancing purposes).

[Diagrams: 'graphql-to-sql' and 'graphql-to-sql-multiplexed' translation flows]
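A drastically simplified sketch of that GraphQL-to-SQL translation is below. Every name here is invented for illustration — real engines like Hasura build far more sophisticated SQL, handling nested selections, permissions, and multiplexed queries:

```python
def graphql_to_sql(table, fields, where):
    """Naive sketch: compile a flat field selection into a SELECT.

    Real GraphQL-to-SQL engines handle joins for nested selections,
    parameterization, and query multiplexing; this only shows the idea.
    """
    cols = ", ".join(fields)
    conds = " AND ".join(f"{k} = '{v}'" for k, v in where.items())
    return f"SELECT {cols} FROM {table} WHERE {conds};"

# A selection like { user(name: "Sally") { grade } } could compile to:
sql = graphql_to_sql("users", ["grade"], {"name": "Sally"})
print(sql)  # → SELECT grade FROM users WHERE name = 'Sally';
```

The point of multiplexing is that many near-identical generated queries like this one can be batched into a single round trip against the database.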

Circling Back to 'The Graph'

Before we dive into the actual merit of the project itself, let's take a look at some entities that are backing 'The Graph'.

The reason why we're taking the time to look at the VCs that have invested in 'The Graph' is because they are pretty notable names.

Specifically, the venture capital firms that have invested in 'The Graph' are:

  • Digital Currency Group (x2; they participated in two rounds - which is interesting)
  • Multicoin Capital (x2)
  • Coinbase Ventures
  • Framework Ventures
  • DTC Capital
  • Lemniscap (also invested in projects like Kava.io)
  • South Park Commons
  • Kilowatt Capital LLC

[4]

The stature of these firms in the digital currency space goes without saying.

Below is a graph displaying all of Digital Currency Group's investments alone:

[Chart: Digital Currency Group's portfolio of blockchain investments]

Their portfolio currently holds more than 100 companies in the blockchain space (many of them recognizable by name to most readers, undoubtedly).

What This Means

To put it conservatively? There's a near guarantee that this project will be in the space for the foreseeable future (post-launch).

Perhaps if this project had received investments from entities like Binance, NGC Ventures, or others on that side of the pond, there would be reason for investor trepidation, since those entities have developed a reputation for investing in and launching "pump and dump" projects.

However, the firms that are involved in 'The Graph' typically look to milk their projects over the long run (i.e., Chainlink, Aave, Matic, etc.).

And despite the initial parabolic growth, there is typically a 'second wave' that occurs at some point with these investments after they have been launched.

Looking at the Technical Merits of the Project Itself

For this portion of the report, we're going to refer to the project's documentation, located on their site here. [5]

Here we find a more comprehensive description of what 'The Graph' is aiming to do, which is also re-published below for convenience:

"The Graph is a decentralized protocol for indexing and querying data from blockchains, starting with Ethereum. It makes it possible to query data that is difficult to query directly."

There is some validity to the claim that the data on blockchains can be difficult to query directly since there are different schemas, modes of operation, etc.

However, in the project's nascent phase, there is no reason to expect that it will be comparatively more useful than anything currently existing, since it will begin by querying the Ethereum blockchain (many tools already propose to do this; not to mention Ethereum's own JSON-RPC API, which can be used as well).

According to the documentation, it solves the issues of parsing and mitigating 'uncle blocks', 'chain reorganizations', etc., by:

[Providing] a hosted service that indexes blockchain data. These indexes ('subgraphs') can then be queried with a standard GraphQL API. In the future, the hosted service will evolve into a fully decentralized protocol with the same capabilities. Both are backed by the open source implementation of Graph Node.

That Was Some Negative News

I'm referring to the part where the documentation notes that this will be a "hosted service" at the onset, with plans for it to eventually be 'a fully decentralized protocol' at some point in the future (post-launch).

That's a problem for many reasons:

  • Every other API solution for Ethereum (of which there are many) can do exactly what 'The Graph' will be able to do out of the gate.
  • Following from the first point, it's plausible to suggest that 'Dune Analytics' represents a steep upgrade over what 'The Graph' will be able to offer (as 'The Graph' does not stipulate that the data will be transformed into an SQL / database backend, or piped into a graphical interface like Redash, Metabase, or Kibana).
  • What makes the solution more mundane is that it seems to only complicate the process we described above for recreating an application like Dune Analytics.

The documentation states that:

"The Graph learns what and how to index Ethereum data based on subgraph descriptions, known as the subgraph manifest. The subgraph description defines the smart contracts of interest for a subgraph, the events in those contracts to pay attention to, and how to map event data to data that The Graph will store in its database."
"Once you have written a subgraph manifest, you use the Graph CLI to store the definition in IPFS and tell the hosted service to start indexing data for that subgraph."

The inherent benefit of using 'The Graph' remains vague (or possibly nonexistent), unless the project is promising to provide these features to end users at absolutely zero cost (which isn't entirely impossible, since 'IPFS' was mentioned; but if that is the case, then it raises the question of what the value-add of this protocol would be).

Revising Some of Our Initial Assumptions

Moving further into the documentation, the following pipeline is proposed:

  1. "A decentralized application adds data to Ethereum through a transaction on a smart contract."
  2. "The smart contract emits one or more events while processing the transaction."
  3. "Graph Node continually scans Ethereum for new blocks and the data for your subgraph they may contain."
  4. "Graph Node finds Ethereum events for your subgraph in these blocks and runs the mapping handlers you provided. The mapping is a WASM module that creates or updates the data entities that Graph Node stores responding to Ethereum events."
  5. "The decentralized application queries the Graph Node for data indexed from the blockchain, using the node's GraphQL endpoint. The Graph Node in turn translates the GraphQL queries into queries for its underlying data store to fetch this data, making use of the store's indexing capabilities."
  6. "The decentralized application displays this data in a rich UI for end-users, which they can also use to issue new transactions on Ethereum."
  7. "The cycle repeats."
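The pipeline above (particularly steps 3-5) can be sketched in miniature. Every name below is invented for illustration — Graph Node's real mapping handlers are WASM modules, and its store is a proper database, not a dictionary:

```python
# Hypothetical sketch of the indexing loop described above.

store = {}  # stands in for Graph Node's underlying data store

def handle_transfer(event):
    """Mapping handler: create or update an entity for an event (step 4)."""
    entity = store.setdefault(event["to"], {"id": event["to"], "received": 0})
    entity["received"] += event["value"]

def scan_block(block):
    """Steps 3-4: find events of interest in a new block, run handlers."""
    for event in block["events"]:
        if event["name"] == "Transfer":
            handle_transfer(event)

def query_store(account):
    """Step 5: answer a (GraphQL-shaped) query from the store."""
    return store.get(account)

# Simulate two new blocks, each carrying a Transfer event:
scan_block({"events": [{"name": "Transfer", "to": "0xabc", "value": 5}]})
scan_block({"events": [{"name": "Transfer", "to": "0xabc", "value": 7}]})
print(query_store("0xabc"))  # → {'id': '0xabc', 'received': 12}
```

The key design point is that queries never touch the chain directly: reads are served entirely from the derived store, which is kept current by the block-scanning loop.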

It appears that the process above is describing a means of anchoring / mapping finite data on the blockchain to some other arbitrary data that's been assigned by the user.

And that the result (mapped data) is being stored on the blockchain in a format that can later be reliably queried through this tool.

Again, this is merely an assumption at this point.

It is equally possible that merely querying data via 'The Graph' triggers some sort of smart contract execution (almost like IFTTT on the blockchain).

The usefulness of this idea, in itself, is limited (in the author's opinion (Hi!)).

We're nearing two thousand words on this piece, so we'll pick back up here in the documentation soon for a part two.

References

[1]: https://graphql.org

[2]: https://explore.duneanalytics.com/dashboard/dex-metrics

[3]: https://ichanneltech.com/graphql-engine-live-queries-md-at-master-·-hasura-graphql-engine-·-github/

[4]: https://www.crunchbase.com/organization/the-graph/company_financials#investors

[5]: https://thegraph.com/docs/introduction
