On the word of a Librehash community member, a project that we should be on the "lookout" for is one called 'parsiq'.
Being entirely unfamiliar with the project and its general premise, we have curated this brief project report from a completely neutral perspective.
Starting With the Site
As always, we're going to start with the project's main website, which can be accessed at 'https://www.parsiq.net' (at the time of writing).
Upon visiting the front page, we were greeted with the following:
First impression is a bit of exasperation.
There is a well-known protocol called 'JSON-RPC' that already allows users to extract virtually unlimited information from the blockchain.
In addition, there are other competitors in the blockchain space, each convinced that its API solution is more novel than the last project's proposed implementation.
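For anyone wanting to see just how low the barrier is, below is a minimal sketch (Python, standard library only) of the JSON-RPC request body that node software such as Bitcoin Core already accepts. The method shown ('getblockcount') is a real Bitcoin Core RPC, but the node endpoint, credentials, and transport are left out here and would depend on one's own setup:

```python
import json

def jsonrpc_request(method: str, params: list, request_id: int = 1) -> str:
    """Build a JSON-RPC request body of the kind a Bitcoin/Ethereum
    node accepts over HTTP (e.g., Bitcoin Core's `getblockcount`)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params,
    })

# A request asking a (hypothetical, locally-running) node for the chain tip height.
body = jsonrpc_request("getblockcount", [])
```

Sending that body to a node you control is all it takes to start "extracting blockchain data" with zero vendors involved.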
Digging a Bit Deeper into the Site
Before we can make any valid conclusions on this project or what it is that they're doing, we're going to have to extract more information first.
Going back to the site, it's worth documenting that their headline states that they are able to, "Turn blockchain data into action."
This is followed by a byline stating the following:
"Connect blockchain activity to off-chain apps and devices. Monitor and secure DeFi applications; build custom event triggers and power real-time automations."
So upon closer inspection, it appears that this project is attempting to slide into the niche that 'The Graph' created with their project.
Making Our Way Down the Site
Scrolling a bit further down (past the offer of a free demo as well as the "try for free" button), we were met with the following interactive loading graphic:
It can at least be stated that their user interface looks nice. But on the basis of what we have seen thus far, this appears to be a non-product of sorts.
Reason for the Harsh Appraisal Thus Far
The harsh reality is that this project is attempting to sell something that is already:
- Widely available (often for free)
- Or, if not widely available, extremely easy to build
Example Setup of a Near-Identical Service Offering
Let's take a detour over to a site called 'bitquery' (bitquery.com).
Here they are pictured below:
As we can see in the screenshot above, not only do they provide blockchain data for a plethora of chains (more on this in a second), they also have this data formatted in such a way to where someone can parse the on-chain data and make meaningful extractions with relative ease.
Further on down the page it is stated that bitquery is:
"A set of software products that parse, index and store blockchain data in a unified way. We started with bloxy.info as an analytics explorer and APIs for Ethereum Mainnet. With Bitquery, we are crossing the 'chain chasm' by delivering a set of products that can work across blockchains and offer Market Analytics, Money Flow, DEX Trading, Decentralized Finance, and Scientific Research."
The best part about all of what was written above is that we do not need to wait for a demo or pay any money to begin leveraging the wealth of data afforded to us by Bitquery.
We just need to visit their 'Bitquery Explorer' as depicted below:
Quick Look at the Bitquery Explorer
A screenshot of the explorer, live at Bitquery can be seen below:
As promised, meaningful data (of any sort) can be extracted from every project that readers see represented above.
The full list (at the time of writing) includes:
- Bitcoin Cash
- Bitcoin SV
- Ethereum Classic
- Algorand Mainnet
- Binance Smart Chain Mainnet
- Celo Mainnet
- EOS Mainnet
- Tron Mainnet
As well as nearly all of the relevant testnets.
If we click on the panel labeled, 'UTXO Based Blockchains', for instance, we're presented with the following:
Observant users may have noticed that there are buttons in the bottom corner of each chart labeled, 'JS' and 'GraphQL'.
Trying Out the Litecoin GraphQL Data
Below is a sample of just how easy it would be for us to curate data on any and every parameter offered by the Litecoin network - at absolutely no cost.
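As a hedged sketch of what such a query could look like, here we build (in Python, without actually sending anything over the network) a GraphQL request body for recent Litecoin blocks. The field names below are approximations of Bitquery's schema and should be verified against their live explorer before use:

```python
import json
import textwrap

# Field names approximate Bitquery's GraphQL schema (assumption: confirm
# the authoritative shape in their explorer); we only build the request here.
query = textwrap.dedent("""
    {
      bitcoin(network: litecoin) {
        blocks(options: {limit: 5, desc: "height"}) {
          height
          timestamp { iso8601 }
          transactionCount
        }
      }
    }
""").strip()

# The HTTP body one would POST to Bitquery's GraphQL endpoint:
request_body = json.dumps({"query": query})
```

That is the entire "data extraction" step: one POST request, no fees, no demo.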
Attaching the Data to a Pipeline
Once the appropriate pipeline (via GraphQL or some other free, open source tool) has been acquired and can be polled via API requests, the next step would be to acquire a 'streaming' platform.
One of the most popular tools to use for this activity is called, 'Apache Kafka', which works via a system of "brokers" and "clients".
Clients, "allow you to write distributed applications and microservices that read, write and process streams of events in parallel, at scale, in a fault-tolerant manner even in the case of network problems or machine failures. Kafka ships with some such clients included, which are augmented by dozens of clients provided by the Kafka community: clients are available for Java and Scala including the higher-level Kafka Streams library." - source
"Producers are those client applications that publish (write) events to Kafka, and consumers are those that subscribe to (read and process) these events."
Kafka, along with 'ZooKeeper' (another component that's used in unison with Apache Kafka), helps to facilitate this "streaming" of data from an API source over to a designated target.
As the front page states:
"Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications."
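To illustrate the producer / consumer pattern that Kafka is built around, here is a deliberately simplified, in-memory stand-in written in Python. A real deployment would use a Kafka client library (e.g., kafka-python's KafkaProducer / KafkaConsumer) pointed at a running broker rather than a local queue:

```python
import queue
import threading

# An in-memory stand-in for a Kafka topic; real code would point a Kafka
# client at a running broker instead of this local queue.
topic = queue.Queue()
received = []

def producer():
    # "Publish" three illustrative on-chain events to the topic.
    for height in (700001, 700002, 700003):
        topic.put({"event": "new_block", "height": height})
    topic.put(None)  # sentinel: end of stream

def consumer():
    # "Subscribe" to the topic and process events as they arrive.
    while True:
        event = topic.get()
        if event is None:
            break
        received.append(event["height"])

t_p = threading.Thread(target=producer)
t_c = threading.Thread(target=consumer)
t_p.start(); t_c.start()
t_p.join(); t_c.join()
```

The point is that the producer and consumer run concurrently and never talk to each other directly; the topic sits between them, which is exactly the decoupling Kafka provides at scale.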
Attaching the Streamed Data to a Source to Consume it and Implement Some Sort of Finite Action
Yet another free app that would allow users to do exactly what is described in this section's title can be deployed in this situation.
This one is called 'mercure.rocks' (that's the URL for their webpage as well).
The general description for how the PWA works is shown below:
Below is a graphic from the site that provides a rudimentary display for how the app actually functions:
As we can see above, data is streamed from a source (i.e., the 'app server'; which would be Kafka in our scenario), then extracted into a pipeline where actions are performed on the basis of the information that has been extracted.
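Since Mercure delivers its updates to subscribers as Server-Sent Events (SSE), a rough sense of what travels down the wire can be had from this small Python sketch that formats a single SSE frame; the event payload shown is purely illustrative:

```python
import json

def sse_frame(event_id: str, data: dict) -> str:
    """Format one Server-Sent Events frame of the kind a Mercure hub
    pushes to subscribers (each frame is terminated by a blank line)."""
    payload = json.dumps(data)
    return f"id: {event_id}\ndata: {payload}\n\n"

# A hypothetical "new block" update as it would appear on the SSE stream.
frame = sse_frame("42", {"event": "new_block", "height": 700001})
```

Any browser (or server-side client) subscribed to the hub receives these frames as they happen, which is what makes the "trigger an action on new data" loop possible with free tooling.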
Below is a list of potential uses for the 'mercure.rocks' tool:
Moving further into the website, there is a wealth of examples that show us the functionality of the pub/sub capabilities of the app (again, this is an open source tool that can be set up & used with relatively low overhead).
Even though the app in question is a chat app example, the principle still remains:
A) Data is generated from somewhere (akin to how Parsiq claims that they leverage data from the blockchain).
B) It is then piped into the chat application example that we have created
C) An implicit action is taken in the form of ensuring that the other party in the room (ourselves once again) is able to remain privy to the communications taking place in the room at the time.
There is obviously more that can be done from this point, but this exemplifies the ease with which data can be piped into apps such as the one that we're discussing here (see below):
Panning Back to the Parsiq Website
Let's take a look at some of the proposed integrations on the main site (i.e., the actions that can be triggered via data gleaned from the blockchain[s]):
So, as we can see, the workflows that are presented are essentially what we imagined they would be when initially appraising the idea.
Therefore, to be fair, we should use a more cogent example of how this entire pipeline process could be duplicated by using GraphQL against blockchain data (provided by Bitquery) before eventually piping said data into an open source integration app like n8n.
What is 'n8n'?
n8n is an open source tool that essentially mimics the functionality of IFTTT or Zapier.
From their website we can see the following:
Below shows an example workflow that users are able to create with this tool:
As if to drill in the point even further, there is a live API gif on the main website that shows the ease with which one would be able to glean information / data from an API source (i.e., from GraphQL).
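For the curious, n8n workflows can be exported / imported as plain JSON; below is a hedged Python sketch of what a minimal single-node workflow export could look like. The node `type` string follows n8n's naming convention but should be checked against a real export before importing:

```python
import json

# A minimal n8n-style workflow export: one HTTP Request node polling a
# GraphQL endpoint. The node `type` value is an assumption based on n8n's
# naming convention; verify against a genuine export from your instance.
workflow = {
    "name": "Blockchain trigger sketch",
    "nodes": [
        {
            "name": "Poll GraphQL",
            "type": "n8n-nodes-base.httpRequest",
            "parameters": {
                "url": "https://graphql.bitquery.io",
                "method": "POST",
            },
            "position": [250, 300],
        },
    ],
    "connections": {},
}
exported = json.dumps(workflow, indent=2)
```

Chaining further nodes onto this one (filters, notifications, webhooks) is how the IFTTT/Zapier-style "if this on-chain event, then that action" behavior gets built for free.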
Librehash Has an n8n-enabled Website
By coincidence, this is one of the tools that Librehash has set up for access by members (within the members-only App Portal).
One Major Potential Use Case Being Missed
One astronomical use case for these types of projects that appears to be seriously underutilized is using them (in the form of a blockchain, not solely a token) to pull back / validate data using JWTs. The tokens could also be distributed in a manifested format (i.e., via the ERC20 standard that Ethereum uses).
This could allow for the distribution of encrypted secrets, keys, etc.
One major potential use of such a setup could be the delivery of keys (encrypted) providing access to another wallet (or perhaps to the current wallet in question that a user is attempting to spend funds from as a form of intermediate / stand-in 2FA).
JWT is dictated entirely by JSON-structured data and is an IETF-standardized protocol (RFC 7519).
The data structure takes, at most, 100 bytes or so (which means that the overhead for transaction data would be nominal) in addition to it being stateless.
Below is an example of a JWT encoded token (with an example signature provided as well):
The link to the relevant playground can be found here: https://jwt-playground.palvikas5.vercel.app/?value=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c
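To underscore how lightweight the standard is, the token linked above can be reproduced with nothing but the Python standard library. Assuming the well-known placeholder secret 'your-256-bit-secret', the sketch below signs the same header and claims with HMAC-SHA256:

```python
import base64
import hashlib
import hmac
import json

def b64url(raw: bytes) -> str:
    # JWT uses base64url with the '=' padding stripped.
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

def jwt_hs256(payload: dict, secret: bytes) -> str:
    """Encode and sign a JWT with the HS256 (HMAC-SHA256) algorithm."""
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = (
        b64url(json.dumps(header, separators=(",", ":")).encode())
        + "."
        + b64url(json.dumps(payload, separators=(",", ":")).encode())
    )
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)

# The well-known example claims (the same token embedded in the playground URL).
token = jwt_hs256(
    {"sub": "1234567890", "name": "John Doe", "iat": 1516239022},
    b"your-256-bit-secret",
)
```

A couple dozen lines, no dependencies; this is the kind of overhead we mean when we say the data structure is nominal.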
Here is another link for those that wish to create their own JWTs: https://mkjwk.org/ (see below for a live example)
Notably, in the photo above, we can see that one of the algorithms compatible with the JWT / JWK / JWKS / JWE standards is secp256k1, which is the elliptic curve that Bitcoin, Ethereum, Litecoin, Bitcoin Cash, Bitcoin SV, and thousands of other cryptocurrencies use in the formation of their public and private key pairs (which, one can imagine, would only lead to countless additional possibilities; especially within the context of a project whose premise is to trigger events on the basis of data on the blockchain).
Accounting For Potential Mistakes in Implementation
Since certain events are triggered on the basis of things that happen on the blockchain, there must be special precautions taken to ensure that an appropriate amount of time has elapsed before initiating whatever the corresponding option is supposed to be (if the triggered action is one that could have undesirable consequences when inappropriately / inaccurately executed).
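One common (though chain-dependent) safeguard is to gate any irreversible action behind a minimum confirmation depth. The Python sketch below illustrates the idea, with the usual caveat that "6 confirmations" is Bitcoin folklore rather than a universal constant:

```python
def safe_to_act(event_block_height: int, current_tip_height: int,
                min_confirmations: int = 6) -> bool:
    """Only fire a downstream action once the triggering transaction's
    block is buried under enough confirmations to make a reorg unlikely.
    (The default of 6 is a Bitcoin convention; the right number depends
    on the chain and on how costly a false trigger would be.)"""
    confirmations = current_tip_height - event_block_height + 1
    return confirmations >= min_confirmations

# A transaction mined at height 700000 with the tip at 700005 has 6 confirmations.
```

Any pipeline that fires on the very first sighting of a transaction is, by construction, exposed to reorgs and double-spends.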
Graphics like the following (from the main page) make this thought worthy of heavy consideration when assessing the difficulties involved with coordinating chained actions based on updates from the blockchain:
Does Parsiq Run a Full Node?
If not, then this solution could become downright dangerous.
Due to Sybil Attack prevalence on the Bitcoin mainnet (and perhaps many other blockchains), attackers could issue a time warp denial of service attack that provides a different / invalid version of the chain to the node in question.
The reason why this attack would be effective against an SPV node is that SPV nodes do not validate transactions directly; rather, they rely on full nodes on the blockchain to do so. The primary purpose of an SPV client is to allow vendors / service providers to refer to the blockchain in order to administer services without being forced to deal with the burden of running an entire full node for this one niche service.
There is so much more to get into here and those things will be addressed (that's a promise); but at 2k words, we're now pushing up against the outer bound of what is reasonable for one article that covers a particular project.
In the next part, we're going to look at the inherent competition that this project faces from 'The Graph' (especially since the latter project is using established technologies vs. the in-house / homegrown built solution that Parsiq provides).
In addition, The Graph's services appear to be completely free (vs. the priced option provided on the front page for 'Parsiq').