Hey there, fellow developers and builders! We’ve got something exciting to share with you. Ever heard of Substreams? If you’re into blockchain, this is something you’ll want to know about. Let’s dive in.
Graph Builders Office Hours mission
The Graph Builders Office Hours aims to empower developers and teams working with The Graph protocol by focusing on the technical aspects of building decentralized apps.
What’s the deal with Substreams?
Traditionally, the Graph Node was the main player, getting its data feed from a stream of blockchain blocks via the gRPC protocol. This data was then transformed into entities and stored in a Postgres database.
Enter Substreams. They’re changing the game by moving this transformation step from the Graph Node to the Firehose infrastructure. So, instead of writing all the transformation logic in subgraph mappings, developers like us now craft a Substreams module that runs on the Firehose infrastructure. It processes the blocks, extracts the entities, and sends them over to the Graph Node.
Why should you care?
- Efficiency: Substreams handles data far more efficiently. Modules are written in a great language like Rust, and the code the Rust compiler generates is fast and memory-safe, so you get fewer hiccups in production. Once the blocks have been processed by Substreams workers, developers can consume the output multiple times without reprocessing the blocks. Check out some Rust examples for Substreams.
- Flexibility: Substreams are extremely scalable. Depending on your billing plan, you can have up to a hundred Firehose workers processing your Substreams module on the Pinax backend, achieving phenomenal processing speeds.
- Simplicity: Subgraph development is now cleaner and more straightforward since Substreams handles all the heavy lifting. Substreams moves data transformations and aggregations to powerful backend infrastructure, minimizing the load on the Graph Node.
- Adaptability: Substreams are designed to be modular and extensible. You can compose new Substreams on top of others, avoiding duplicate work. This means that as blockchain technology evolves, Substreams can easily adapt to new changes and requirements without major overhauls.
- Consuming Substreams with Substreams sinks: One of the standout features of Substreams is sinks. In addition to using Substreams in subgraphs, you can forward their outputs into dozens of existing sinks or build your own! Want to pipe your Substreams data into a SQL database? Not a problem: use the SQL sink. Key-value storage? Easy! Sink your smart contract updates into a Discord channel or Telegram group? That’s possible too. There are more, including the Google Sheets, ClickHouse, MongoDB, CSV file, and Parquet sinks. Check out the Pinax sink library.
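To make the sink idea concrete, here’s a minimal, stand-alone sketch of what a file-oriented sink boils down to: take the structured records a module emits and render them for the target store. Real sinks stream module outputs from a Substreams endpoint over gRPC; the `TransferRow` type and `csv_sink` function below are illustrative names we made up for this sketch, not part of any real sink API.

```rust
// Toy version of a CSV sink: turn a batch of decoded transfer records
// into CSV text, the way a file-based sink would before writing to disk.
// `TransferRow` and `csv_sink` are hypothetical names for illustration.

use std::fmt::Write;

struct TransferRow {
    block: u64,
    from: String, // placeholder addresses below, not real ones
    to: String,
    value: u128,
}

/// Render a batch of module outputs as CSV: the essence of a file sink.
fn csv_sink(rows: &[TransferRow]) -> String {
    let mut out = String::from("block,from,to,value\n");
    for r in rows {
        // writeln! on a String cannot fail, so unwrap is safe here.
        writeln!(out, "{},{},{},{}", r.block, r.from, r.to, r.value).unwrap();
    }
    out
}

fn main() {
    let rows = vec![TransferRow {
        block: 17_000_000,
        from: "0xa1".into(), // placeholder sender
        to: "0xb2".into(),   // placeholder recipient
        value: 42,
    }];
    print!("{}", csv_sink(&rows));
}
```

A real SQL or ClickHouse sink follows the same loop; only the rendering step changes from CSV lines to INSERT statements.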
Substreams in action
In the video, Yaro shows us how it’s done:
- Kick things off: Using the Substreams CLI, he starts a new Substreams project; see the video for the details. [Video 7:37 – 10:12]
- Craft a module: This defines how blockchain data changes. Yaro makes one that handles transfer events from an ERC20 smart contract. [Video 10:15 – 31:49]
- Test it out: Before going live, he tests the Substreams locally with Docker. Explore more about the Substreams browser. [Video 32:00 – 37:25]
- Launch: The Substreams is then sent off to the studio to start indexing data. [Video 37:28 – 39:38]
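To give a feel for the “craft a module” step, here’s a stand-alone Rust sketch of the kind of extraction such a module performs: decoding a raw ERC-20 Transfer log (its topics and data) into a typed record. A real Substreams module would use the `substreams` and `substreams-ethereum` crates and emit protobuf messages; this version uses only the standard library so the core logic is visible, and it takes the sketch-level shortcut of assuming the transfer amount fits in a `u128`.

```rust
// Sketch of ERC-20 Transfer extraction, the heart of the module in the video.
// Stand-alone stdlib-only version; real modules use the substreams crates.

/// A decoded ERC-20 Transfer event.
#[derive(Debug, PartialEq)]
struct Transfer {
    from: String, // 0x-prefixed 20-byte address
    to: String,
    value: u128, // sketch shortcut: assumes the uint256 amount fits in u128
}

/// keccak256("Transfer(address,address,uint256)"): topic0 of every ERC-20 Transfer.
const TRANSFER_SIG: &str = "ddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef";

fn hex_to_bytes(hex: &str) -> Option<Vec<u8>> {
    let hex = hex.strip_prefix("0x").unwrap_or(hex);
    if hex.len() % 2 != 0 {
        return None;
    }
    (0..hex.len())
        .step_by(2)
        .map(|i| u8::from_str_radix(&hex[i..i + 2], 16).ok())
        .collect()
}

/// Decode a raw log (topics + data) into a Transfer, or None if it isn't one.
fn decode_transfer(topics: &[&str], data: &str) -> Option<Transfer> {
    // An ERC-20 Transfer log has exactly three topics: signature, from, to.
    if topics.len() != 3
        || !topics[0].trim_start_matches("0x").eq_ignore_ascii_case(TRANSFER_SIG)
    {
        return None;
    }
    // Indexed addresses are left-padded to 32 bytes; the address is the last 20.
    let addr = |t: &str| -> Option<String> {
        let b = hex_to_bytes(t)?;
        (b.len() == 32).then(|| {
            let hex: String = b[12..].iter().map(|x| format!("{x:02x}")).collect();
            format!("0x{hex}")
        })
    };
    let amount = hex_to_bytes(data)?;
    if amount.len() != 32 {
        return None;
    }
    // Take the low 16 bytes of the 32-byte big-endian uint256 (sketch shortcut).
    let value = u128::from_be_bytes(amount[16..].try_into().ok()?);
    Some(Transfer {
        from: addr(topics[1])?,
        to: addr(topics[2])?,
        value,
    })
}

fn main() {
    let topics = [
        "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
        "0x000000000000000000000000a0b86991c6218b36c1d19d4a2e9eb0ce3606eb48", // example "from"
        "0x000000000000000000000000dac17f958d2ee523a2206206994597c13d831ec7", // example "to"
    ];
    // 32-byte big-endian amount: 1,000,000
    let data = "0x00000000000000000000000000000000000000000000000000000000000f4240";
    let t = decode_transfer(&topics, data).expect("should decode");
    println!("{} -> {} : {}", t.from, t.to, t.value);
}
```

In an actual module, a function like `decode_transfer` would run over every log in each incoming block, and the resulting records would be emitted as the module’s output for the Graph Node, or a sink, to consume.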
What’s next for Substreams?
While Substreams is still pretty new, it’s set to change how we process and index blockchain data. As the technology evolves, we’re looking forward to more features, better guides, and an even smoother experience for developers. Dive deeper with this awesome list of Substreams resources.
Our chat with Yaro opened our eyes to the potential of Substreams in the blockchain world. For all the devs out there, this is something to watch. We’re on the edge of our seats to see where Substreams will take us next!