§ Blog

§ Scaling Ethereum One Tx At A Time

§ Mar 1, 2021

With regard to building rugpullindex.com, there are currently two problems bugging me. One is that gas prices on mainnet are insanely high right now. And two is that the Ethereum front-end space has become even more hostile than what I was used to before.

In a recent post over on my blog, I've made the argument that "Ethereum isn't fun anymore" and that "web3 is a stupid idea". Though I've received some criticism for these posts, I'd now like to double down. I have an alternative vision for web3, purely from a pragmatic, architectural point of view.

I've already written it up in long form over in the Ethereum/EIPs issue section. We need to start thinking practically about light clients now. Ironically, full nodes are costing developers real dollars today, and building truly decentralized applications is hardly possible anymore without a credit card.

I guess nobody designed the Ethereum protocol with light clients in mind. Still, I think there are small fixes, applied here and there, that could dramatically improve user experiences in web3's front ends. So what's the plan?

§ Scaling Web3 via WebRTC

Just recently, WebRTC was made a W3C and IETF standard. WebRTC (or Web Real-Time Communication) is a concept for sharing data directly between users' web browsers without going through middlemen like servers. "Over the past year, WebRTC has seen a 100X increase of usage in Chrome due to increased video calling from within the browser.", the article states. But WebRTC isn't limited to distributing video; we can reasonably use it to spread any kind of data. And one kind of data that I've ranted about not being distributed well enough is that of the Ethereum blockchain.

§ ... and Via WebTorrents

WebTorrents allow us to download torrents directly in the web browser. instant.io, for example, enables a user to paste in a magnet link and download its contents within the browser instantly. A client could now easily send a magnet link to start syndicating files.
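To make that concrete, here's a minimal sketch of roughly what instant.io does, using the WebTorrent client API; the magnet link below is only a placeholder:

  // A minimal sketch: add a magnet link with WebTorrent and list its files
  // once the metadata has arrived. The magnet URI is a placeholder.
  const WebTorrent = require('webtorrent')

  const client = new WebTorrent()
  const magnetURI = 'magnet:?xt=urn:btih:...' // paste any magnet link here

  client.add(magnetURI, (torrent) => {
    console.log('Got metadata for', torrent.infoHash)
    torrent.files.forEach((file) => {
      console.log(file.name, file.length, 'bytes')
    })
  })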

In general, torrents have a rather bad reputation, mainly because they've been a driver of piracy in the past. Speaking of their technical properties, however, torrents are one of the coolest technologies around.

So how do WebRTC, WebTorrents, and web3 fit together?

§ ethtorrent, A Decentralized Web3 Provider

WebTorrent utilizes WebRTC in browser environments. It can fall back to webtorrent-hybrid for server-side usage. What's fantastic is that WebTorrent has a distributed hash table built in. It even allows specifying a custom hash function. So what's the plan?

For now, the plan is to democratize access to blockchain data for regular web3 apps again. The first step towards this will be creating a lean component that we can use with web3.js. Its goal is to cache and store all requests from web3.js that have to do with a full transaction or an entire block. We will await the response for these requests, cache it, and offer it for download on WebTorrent via a custom DHT.
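To give an idea of what that DHT layer could look like on the Node side (e.g. inside a bootstrapping node), here is a rough sketch using the bittorrent-dht module's BEP 44 store. The payload and port are purely illustrative, and the exact callback signatures should be double-checked against the module's documentation:

  // A rough sketch: store an encoded transaction as an immutable item in the
  // BitTorrent DHT and fetch it back by its key (bittorrent-dht, BEP 44).
  const DHT = require('bittorrent-dht')

  const dht = new DHT()
  dht.listen(20000, () => {
    const value = Buffer.from(JSON.stringify({ hash: '0xabc', blockNumber: 1 }))

    dht.put({ v: value }, (err, key) => {
      if (err) throw err

      // Any other node in the swarm can now look the item up by `key`.
      dht.get(key, (err, res) => {
        if (err) throw err
        console.log(JSON.parse(res.v.toString()))
      })
    })
  })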

If a second client comes along, each request it makes towards the full node's RPC endpoint will be intercepted and will consult the WebTorrent DHT first. If a transaction can be retrieved via torrents, no call to the RPC endpoint is made at all. That is good for a few reasons: it reduces the number of paid requests a dapp makes against a full node, it takes load off node operators, and it keeps blockchain data available peer to peer.
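To make the interception idea more concrete, here's a minimal sketch of such a provider. Everything in it is an assumption about the eventual design: the name EthTorrentProvider doesn't exist yet, and the cache object is imagined as a small async get/put wrapper around the WebTorrent/DHT layer sketched above:

  // A minimal sketch, not the final design: a web3 provider that consults a
  // peer-to-peer cache before falling back to a full node's RPC endpoint.
  const Web3 = require('web3')

  class EthTorrentProvider {
    constructor(rpcUrl, cache) {
      // `cache` is hypothetical: an object exposing async get(key) and put(key, value).
      this.fallback = new Web3.providers.HttpProvider(rpcUrl)
      this.cache = cache
    }

    send(payload, callback) {
      // Only immutable data is worth caching: full transactions and blocks.
      const cacheable = ['eth_getTransactionByHash', 'eth_getBlockByHash']
      if (!cacheable.includes(payload.method)) {
        return this.forward(payload, callback)
      }
      this.cache.get(payload.params[0]).then((hit) => {
        if (hit) {
          // Served from peers: no RPC endpoint call is made.
          return callback(null, { jsonrpc: '2.0', id: payload.id, result: hit })
        }
        this.forward(payload, callback)
      })
    }

    forward(payload, callback) {
      // Cache miss or non-cacheable method: ask the full node, then seed the answer.
      this.fallback.send(payload, (err, response) => {
        if (!err && response && response.result) {
          this.cache.put(payload.params[0], response.result)
        }
        callback(err, response)
      })
    }
  }

  // Usage (URL is a placeholder): new Web3(new EthTorrentProvider('https://rpc.example.org', cache))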

§ What Needs to Happen Now?

I'm not sure if I'll handle this project as part of rugpullindex.com. However, it was only through working on rugpullindex.com that I had the idea in the first place. In any case, I think building the project shouldn't be too much of a hassle, as WebTorrent comes with batteries included. As a start, I'll attempt to create a library that can bootstrap the Ethereum WebTorrent network for sharing transactions and blocks. Then, I'll build a simple bootstrapping node capable of talking back to an archive node for any transactions or blocks missing from the DHT.

Then, I think it's a question of whether the idea is accepted and used by the Ethereum community. However, since such a web3 provider could significantly reduce the number of requests a dapp makes daily, I could imagine there being a will to give it a try.

And that's how I want to contribute to scaling Ethereum for now. I hope you enjoyed reading. Feel free to let me know your thoughts by reaching out to me. My email is on my blog.

That's all for today.

Best, Tim

§ Feb 24, 2021

§ The Algorithm; It's Working!

§ Feb 23, 2021

On Feb 18, 2021, the maintainer of the "Oceancap - Datapool Evaluation and Charting" (ADASTA-60) data set tweeted:

1.) We decided to close our Oceancap pool on 21/02 due to the market situation. We are pretty sure that @oceanprotocol is working hard on preparing an updated Marketplace in the near future. We are waiting on the sidelines and take a break for now.

— Oceancap - Datapool Evaluation and Charting (@OCharting) February 18, 2021

Since rugpullindex.com had listed ADASTA-60 in its top-25 index for a while, I was curious how the ranking algorithm would react to the announcement. Remember, the algorithm ranks a data set based on its market's performance. It works "autonomously" and isn't capable of comprehending the statement; it simply rates each data set by how its market performs. Our thesis is that if a data set's market is strong, its value is high too, and vice versa.

Here is ADASTA-60's performance within the context of the announcement:

Date                      Score  Gini  Liquidity (OCEAN)  Price (OCEAN)
2021-02-17T23:01:04.300Z  0.62   0.98  38969              20.80
2021-02-18T23:01:03.913Z  0.60   0.98  37664              19.60
2021-02-19T23:01:03.956Z  0.55   0.98  37549              19.47
2021-02-20T23:01:04.073Z  0.57   0.98  37549              19.47
2021-02-21T23:03:56.286Z  0.45   0.97  21411              11.10
2021-02-22T23:01:03.474Z  0.22   0.96  9148               4.74

Looking at the market's data, we can see the following: between Feb 17 and Feb 22, the pool's liquidity dropped from 38,969 OCEAN to 9,148 OCEAN, its price fell from 20.80 OCEAN to 4.74 OCEAN, and its score consequently decreased from 0.62 to 0.22, with the sharpest drop right after the announced closure on Feb 21.

§ Conclusion

rugpullindex.com's initial thesis that markets are a proxy for a data set's value has found some evidence in this particular case. rugpullindex.com successfully forecast an investment risk (a Gini index close to 1) before it manifested itself. Its algorithm is now automatically decreasing ADASTA-60's stake as the market reacts to the announcement.

I find this result exciting as it's the first time we can see the collected data and my work in action. 🥳

In the future, I want users to gain the same insights I was able to acquire today. I'm excited to continue working on that.

Best, Tim

§ Feb 17, 2021

§ Feb 16, 2021

§ Feb 15, 2021

As announced on Feb 12, 2021, liquidity and price are now displayed in EUR. However, EUR values are not yet used within the ranking algorithm.

§ Feb 14, 2021

Midnight: After months, I made some changes to the crawler again, which led to the page being down for the last two nights. The reason was a bug in the price crawler.

I was trying to get OCEAN's current EUR price using Coingecko's historical API, which didn't send back any results (because it's for "historical" and not "present" time). The crawler is now using Coingecko's simple API to get the price.
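For reference, a minimal sketch of fetching the current price via the simple endpoint; the coin id "ocean-protocol" and the use of node-fetch are assumptions about the setup, not necessarily what the crawler does:

  // A minimal sketch: fetch OCEAN's current EUR price from CoinGecko's
  // /simple/price endpoint (coin id "ocean-protocol" is assumed).
  const fetch = require('node-fetch')

  async function getOceanPriceEUR() {
    const url =
      'https://api.coingecko.com/api/v3/simple/price?ids=ocean-protocol&vs_currencies=eur'
    const response = await fetch(url)
    const body = await response.json()
    return body['ocean-protocol'].eur
  }

  getOceanPriceEUR().then((price) => console.log('OCEAN/EUR:', price))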

A few reflections on what I learned by having to open my laptop before breakfast and before going to bed on a Saturday:

Working on a website that always displays new information is fun. I check rugpullindex.com myself daily. I like the feeling of gardening the website. But soon I want to find ways to improve on the issues mentioned above. It may just be a matter of improving the crawler's tests.

Best, Tim

§ Feb 12, 2021

Today marks an important day in the life of rugpullindex.com and OCEAN. When I was trying to compartmentalize the crawler's myriad subqueries, I noticed that, as intended, all data sets are normalized based on the all-time highest liquidity any data set's pool has reached.

What I had neglected was that I used OCEAN as the unit of liquidity. That makes no sense, though, as the goal is to compare every data set relative to the all-time best-performing data set. With a fluctuating token as the unit of measurement, however, this may not work well.

Consider the data set QUICRA-0, which had 499,296 OCEAN in its pool yesterday. Assuming that OCEAN/EUR traded at 0.5 EUR yesterday, QUICRA-0 had roughly 250,000 EUR of liquidity in its pool. Now, consider that today the price of OCEAN increases by another 0.5 EUR to 1 EUR, but no change occurs in QUICRA-0's liquidity pool. It means that while the number of OCEAN backing QUICRA-0 didn't change, its performance increased because the price of OCEAN doubled. Compared to yesterday, QUICRA-0 is doing twice as well!
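As a toy illustration of the problem, using the numbers from the example above:

  // The pool's OCEAN balance stays constant while its EUR value doubles.
  const poolLiquidityOcean = 499296 // QUICRA-0's pool, in OCEAN
  const oceanEurYesterday = 0.5
  const oceanEurToday = 1.0

  console.log(poolLiquidityOcean * oceanEurYesterday) // ~250,000 EUR
  console.log(poolLiquidityOcean * oceanEurToday)     // ~500,000 EUR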

Hence, I plan to measure a data set's liquidity in fiat from now on, specifically in EUR. I've already finished adjusting the crawler, but I wasn't able to finish integrating the change into the UI yet. Once the update is live, I'll inform you about it in detail.

Best, Tim

§ Feb 8, 2021

👋 Today marks the first day that I'm "getting paid" for working on rugpullindex.com. It's because I came in seventh place in OceanDAO's round 2 of grant proposals and was rewarded 10k OCEAN. My original plan was to use the DAO's grant as a freelance budget to work on rugpullindex.com properly. Hence, I swapped them to USDC.

Having a stable supply of digital currency now means I can "invoice" rugpullindex for the work I'm doing. It's really just a fancy way of doing accounting. There's no official company or anything. Still, it's a big step as it means that I'm now able to justify spending time on the project during "my working hours."

And it shows, because I've already been working on it for a day. I've expanded the navigation and slimmed down the landing page. I did that to get better results on PageSpeed Insights and to make rugpullindex.com perform better in search engine results. As a result, there's now an about page and this blog. I'm planning to deprecate the old /changelog.txt.

Another SEO thing I've done is add a /sitemap.xml for crawlers. I'm also tracking the website's performance in Google's Search Console now. My plan is to make the website more informative over time.

And that's all I have to say for today. I hope you like the changes. I also want to thank everyone who voted for me in the OceanDAO. Thanks!

Hoping to see you around here soon again.

Best, Tim

§ 01/02/2021

§ 20/01/2021

Wow, it's been a while since I wrote something here. Still, I was busy thinking about the next steps for rugpullindex.com, mainly about receiving funding to be able to continue the project.

And, indeed, I'm recognizing a promising opportunity ahead with Ocean Protocol's "OceanDAO" [1] having its second grants funding round on Feb 1, 2021. On Monday, that led me to write a first draft of a grant proposal [2]. OceanDAO recommends submitting an "Expected ROI calculation" in the grant proposal to make voters understand the potential and future returns of the project [5]. However, it turned out that the DeFi Pulse Index isn't able to capture a significant market share within the DeFi ecosystem (0.03% or $55M). When applying that percentage to rugpullindex.com, the prospect became even bleaker, as 0.03% of $600k would only amount to $183 of market capture for rugpullindex.com.

Even though it did disappoint me that the math wasn't working out, I'm still as bullish as ever about the project. Especially as I recently read in one of Matt Levine's "Money Stuff" newsletter posts that traditional index funds can become huge antitrust problems as soon as they start to hold majority shares in certain market segments [3]. When, for example, the S&P 500 is suddenly capable of voting on board decisions of FAANG (Facebook, Amazon, Apple, Netflix, Google), I think it's no surprise that it wouldn't incite any of those companies to compete against each other. After all, that could lead to a decrease in the index's value.

To me, that truly sounds like an antiquated problem. Technology already allows sensing a crowd's opinion. Within blockchain, such governance scenarios have long been a topic of discussion. Actually, they work today [4]. And that's why I think that building indexes on blockchains is a cool problem that can address real-life problems.

In conclusion, I would like to say that I'm still eager to continue development here. I hope to receive a grant. So if you're reading this, make sure to vote!

That's all. Have a nice day.

§ References

§ 02/01/2021

§ 01/01/2021

§ 16/12/2020

To increase the service's virality, I've decided that I want to have some type of badge for data set providers. I ended up using shields.io. By visiting the FAQ, you can now add a badge for your own data set. It's a beta feature that I haven't tested too much, so I'm curious how it goes.

§ 11/12/2020

Released the rugpullindex.com launch blog post on my personal website: https://timdaub.github.io/2020/12/11/rugpullindex/

It got lots of attention, which made me happy. Lots of people have reached out since then.

§ 09/12/2020

§ 7/12/2020

This morning, when I had my coffee in the park, I thought again about what I wrote last week regarding the inclusion of liquidity in my risk model. I'm specifically referring to the changelog.txt entry of 30/11/2020, where I proposed multiplying the absolute currency value of a pool's liquidity with the Gini score.

Thinking about it again, I realized that I don't like the approach I proposed back then anymore. The reason is that using, e.g., the EUR value of a pool's liquidity in a multiplication seems fairly arbitrary. Why EUR, for example, and not USD or OCEAN?

After all, the Gini score and each market's liquidity are independently measured indicators of quality. Hence, this morning, I started thinking about how to improve what I proposed last week.

I believe that a relative quality measure that combines liquidity and equality of distribution is still useful for investors. But I think it should not be denoted in a commonly known unit unless it makes a specific quality statement about that unit.

For example, in the future, I could imagine a quality measure called "safe liquidity" that is denoted in OCEAN, EUR, or USD and that gives information about the absolute amount of liquidity that is safely distributed within a pool.

However, for now, I'm not interested in that measure. Instead, I'd like to use a comprehensive, relative measure of liquidity across all markets as a measure of an individual pool's liquidity. My friend Jost Arndt actually proposed a simple algorithm to find such a relative measure for all pools' liquidity:

  1. Among all pools, find the one with the highest liquidity: L
  2. For each pool with liquidity l, compute its relative liquidity score Rl as: Rl = l / L

His argument was that now, since every pool's relative liquidity score stays within the boundaries of 0 < Rl ≤ 1, this measure could be used to compute an overall score s to rank all data sets:

s = Rl * (1 - gini)
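As a sketch of how this could be computed over all pools (the shape of the pool objects is an assumption, not the crawler's actual data model):

  // A minimal sketch of the relative scoring idea: normalize each pool's
  // liquidity by the highest liquidity L, then weight by (1 - gini).
  function scorePools(pools) {
    const L = Math.max(...pools.map((p) => p.liquidity))
    return pools.map((p) => {
      const Rl = p.liquidity / L
      return { ...p, score: Rl * (1 - p.gini) }
    })
  }

  console.log(
    scorePools([
      { name: 'A', liquidity: 224665, gini: 0.88 },
      { name: 'B', liquidity: 40900, gini: 0.69 },
    ])
  )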

The properties of this model are great because the score s always stays between 0 and 1, it rewards both high liquidity and an equal distribution of pool shares, and it doesn't depend on an arbitrary currency unit.

However, I'm not only a fan of the algorithm's properties. From the get-go of this project, I've been convinced that a simple measure is key to the index's meaningfulness and utility. I believe that the above formula passes those criteria. Hence, for the upcoming weeks, I'm planning to integrate it into the website.

And that's all for today's thoughts on rugpullindex.com. If you've found this entry useful or have feedback, feel free to reach out via tim@daubenschuetz.de

Best, Tim

§ 1/12/2020

The root endpoint / now includes a "Cache-Control" header with a max-age that expires around the time of rugpullindex.com's daily crawl. This means that a user's browser now caches the site, and additionally, it allows a CDN or reverse proxy to cache the site too. For now, I've configured my reverse proxy to cache according to "Cache-Control" headers, which speeds up page loads significantly. Since statically cached content is served for most of the day, this should allow handling lots of traffic too.
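For illustration, a hedged sketch of what setting such a header could look like, assuming an Express-style server; the framework, the 24-hour max-age, and renderIndexPage are assumptions, not the site's actual code:

  // Set a Cache-Control header on the root page so browsers, CDNs, and
  // reverse proxies can cache it until roughly the next daily crawl.
  const express = require('express')
  const app = express()

  app.get('/', (req, res) => {
    res.set('Cache-Control', 'public, max-age=86400') // 24 hours, illustrative
    res.send(renderIndexPage()) // hypothetical render function
  })

  app.listen(3000)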

§ 30/11/2020

Currently, I'm still thinking a lot about rugpullindex.com and how to grow its audience. I believe that in the future, it will be really important to be able to automatically filter and sort blockchain-based markets by some sort of metric, similar to how the web is sorted by algorithms today (social media feeds, Google's PageRank, etc.).

In terms of improving the site in the short term, I'm hence driven to do two things in particular:

  1. Improving the scoring method;
  2. Improving the site's documentation and transparency.

Regarding (1), improving the scoring method, I already had a particular idea that I'd like to motivate briefly.

Most decentralized exchanges using automated market makers currently use liquidity to measure a pool's overall performance. However, as we've discussed already, this ignores the fact that distinct liquidity can have distinct quality. As we've assumed from the beginning, the distribution of liquidity shares within a pool can be used as a qualitative metric. For example, a pool whose shares are concentrated in a single address can be drained ("rug pulled") at any moment, whereas a pool whose shares are spread across many holders is less exposed to that risk.

Hence, instead of sorting the index only by a pool's liquidity distribution, I'm now thinking of using the score as a weight on the pool's liquidity:

Score_new = Liquidity(pool) * (1 - Gini(pool-shares))

For a pool like TREPEL-36, this would mean the following (values from today): at a score of 0.69 and a total liquidity of 40900.54€, its new score is:

40900.54€ * (1 - 0.69) = 12679.17€

whereas for TASLOB-45, having a score of 0.88 with a total liquidity of 224665.20€, it means:

224665.20€ * (1 - 0.88) = 26959.82€

This change, as can be seen above, would favor large pools over small ones while still being significantly biased towards an equal distribution of shares.
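For a quick sanity check of the numbers above in code:

  // Weight each pool's liquidity by (1 - Gini): larger pools win, but an
  // unequal share distribution still drags them down.
  const weight = (liquidityEur, gini) => liquidityEur * (1 - gini)

  console.log(weight(40900.54, 0.69).toFixed(2))  // "12679.17" (TREPEL-36)
  console.log(weight(224665.20, 0.88).toFixed(2)) // "26959.82" (TASLOB-45)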

If you've made it this far: thanks for reading! And if you have feedback on this idea, feel free to contact me! My email is tim@daubenschuetz.de

That's all for today.

Best, Tim

§ 27/11/2020

§ 24/11/2020

§ 21/11/2020

§ 19/11/2020

§ 18/11/2020

§ 17/11/2020

§ 16/11/2020

§ 14/11/2020

§ 13/11/2020