Web3 is pioneering the next generation of the digital landscape. Despite advancements, challenges like fragmented and inaccessible on-chain data persist. That's why Indicant Analytics has introduced a comprehensive data solution that harnesses AI to automate blockchain data collection, processing, and analysis.
This initiative aims to establish cross-industry data standards, making it more accessible for developers and analysts to retrieve and interpret information.
Navy Tse, founder and CEO of Indicant Analytics, shared with us:
“We aspire to become the 'Google Analytics' of Web3, delivering real-time growth analytics and powerful analytical tools to help Web3 projects achieve their objectives using our dataset spanning over 20 public chains and integrated data. Simultaneously, we're exploring the convergence of AI and data, such as creating data analysis panels through AI, to further accelerate innovation in the blockchain sector.”
Navy believes that the integration of AI and blockchain will transform data adoption in Web3. On one hand, high-quality data forms the foundation for training AI models; on the other, AI can help generate high-quality data:
“Data is the vital essence of the industry. We're striving to build a symbiotic ecosystem where AI and blockchain complement each other, thereby driving the advancement of the entire Web3 landscape.”
Q1: Navy, could you provide us with an overview of what Indicant Analytics is currently working on?
Indicant Analytics is dedicated to creating an integrated data platform that bridges the gap between Web2 and Web3 data.
We specialize in structuring data. Despite the relative advantages of Web3 over Web2 in terms of on-chain data accessibility, certain challenges remain. These include the emerging nature of the industry, a lack of standardized practices, and unstructured data. Consequently, data utilization becomes problematic.
To illustrate, consider the scenario where you want to access transaction data on OpenSea from major chains such as Ethereum, Solana, and Polygon. This process involves understanding OpenSea's operational framework, studying smart contract code, and sequentially extracting transaction data from each chain.
This process is problematic in several ways. First, data collection is tedious and error-prone. Second, it presents technical challenges, given the differences in contract architecture and data structures across chains. Finally, it wastes resources: if 1,000 people need this data, each of them has to repeat the same complex process, a level of duplication that severely hinders data collection efficiency and squanders technical resources.
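To make that per-chain extraction concrete, here is a minimal sketch in Python using web3.py. It assumes an EVM chain; the RPC endpoints, marketplace contract address, and event signature are placeholders rather than OpenSea's real deployment details, and a non-EVM chain such as Solana would need an entirely different client and decoder.

```python
# Minimal sketch of the manual, per-chain extraction described above (Python + web3.py).
# The RPC URLs, contract address, and event signature are placeholders, not OpenSea's
# real deployment details; non-EVM chains (e.g. Solana) would need a different client.
from web3 import Web3

CHAINS = {
    "ethereum": "https://eth.example-rpc.com",      # placeholder endpoints
    "polygon": "https://polygon.example-rpc.com",
}
MARKETPLACE_CONTRACT = "0x0000000000000000000000000000000000000000"   # placeholder
SALE_EVENT_TOPIC = Web3.keccak(text="Sale(address,address,uint256,uint256)").hex()

def fetch_sales(chain: str, from_block: int, to_block: int):
    """Pull raw sale logs from one chain; decoding them differs per contract version."""
    w3 = Web3(Web3.HTTPProvider(CHAINS[chain]))
    return w3.eth.get_logs({
        "fromBlock": from_block,
        "toBlock": to_block,
        "address": MARKETPLACE_CONTRACT,
        "topics": [SALE_EVENT_TOPIC],
    })

# The study-the-contract / write-a-decoder cycle has to be repeated for every chain and
# every protocol, which is exactly the duplicated effort a shared dataset removes.
for chain in CHAINS:
    logs = fetch_sales(chain, from_block=18_000_000, to_block=18_000_100)
    print(chain, len(logs), "raw sale logs")
```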
This brings us to the mission of Indicant Analytics: to extract data from diverse sectors such as GameFi, NFTs, and DeFi and establish standardized data practices for the Web3 industry. This, in turn, will enable developers and industry participants to access and analyze data efficiently and accurately.
To date, we've launched platforms on more than 20 blockchains, organized into three core segments.
Q2: Integrating AI with Web3 has become a fascinating trend, and technologies such as GPT and AIGC have demonstrated tremendous potential. Navy, from the perspective of the data landscape, how can AI be seamlessly integrated with Web3? Let's examine this from both technical and application perspectives to envision the possibilities of this convergence.
As a data platform, Indicant is a natural fit with AI. AI encompasses three key elements: computing power, data, and algorithms. Among these, computing power forms the foundation that supports AI model training and implementation. Simultaneously, data constitutes the core of AI, and algorithms determine AI performance, including model quality and application capabilities.
Of these, data is undoubtedly the most crucial. Data is the lifeblood of industries and projects, and its importance extends to key areas such as privacy and security, where its value is immeasurable. AI functions as both a consumer and a generator of data.
Currently, Indicant's integration of data and AI covers several specific areas:
During the data content generation phase, the role of AI within our platform is critical. Initially, we utilize AI to create data processing code, providing users with a more streamlined data analysis experience.
More specifically, we are driving innovation in two directions.
First, we are curating and categorizing protocol data. Taking smart contracts deployed on the blockchain as an example, our AI can autonomously identify the protocol to which a contract belongs, the type of contract, and even whether the contract falls under categories such as LP or Swap on DEX platforms. This proactive structuring and categorization significantly enhances data accessibility.
Second, we can generate higher-level domain data based on our protocol data. For instance, we use AI to generate data within domains such as GameFi, NFT, etc., providing users with richer data resources. This approach improves the quality of data content and enables users to better understand data across diverse industries.
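As a rough illustration of the contract categorization step described above, one could ask an LLM to label a contract given an ABI excerpt. This is a sketch under assumptions, not Indicant's actual pipeline; the model name, label set, and helper function are hypothetical.

```python
# Rough sketch of LLM-assisted contract categorization (not Indicant's actual pipeline).
# The model name, label set, and helper are hypothetical; a real system would parse and
# validate the model's answer instead of trusting free-form text.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

LABELS = ["DEX: Swap", "DEX: LP", "NFT marketplace", "Lending", "GameFi", "Other"]

def categorize_contract(address: str, abi_excerpt: str) -> str:
    """Ask the model which protocol and category a contract most likely belongs to."""
    prompt = (
        f"Contract {address}\nABI excerpt: {abi_excerpt}\n"
        f"Classify it into exactly one of {LABELS} and name the protocol if recognizable. "
        'Answer as JSON: {"category": ..., "protocol": ..., "confidence": 0 to 1}'
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

# An ABI exposing addLiquidity/removeLiquidity would typically land in "DEX: LP".
print(categorize_contract("0xPoolExample", "addLiquidity(address,address,uint256,uint256)"))
```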
To enhance the front-end user experience, we have introduced an AI-powered interactive analysis feature. When users engage with Indicant for data analysis, the experience resembles a conversation with ChatGPT: they ask questions and directly receive relevant data analysis reports. Under the hood, the system converts natural-language questions into SQL queries, dramatically lowering the entry barrier for data analysis.
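A minimal text-to-SQL sketch of that interaction might look as follows. The schema, table name, model choice, and database file are illustrative assumptions rather than Indicant's actual implementation, and a production system would validate the generated SQL before executing it.

```python
# Minimal text-to-SQL sketch of the conversational analysis flow described above.
# The schema, table, model, and database file are illustrative assumptions; a production
# system would validate the generated SQL before running it.
import sqlite3
from openai import OpenAI

client = OpenAI()

SCHEMA = """nft_trades(chain TEXT, collection TEXT, buyer TEXT, seller TEXT,
            price_usd REAL, traded_at TIMESTAMP)"""

def question_to_sql(question: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative
        messages=[{
            "role": "user",
            "content": f"Schema:\n{SCHEMA}\nWrite one SQLite query answering: {question}\n"
                       "Return only the SQL, with no explanation or formatting.",
        }],
    )
    return resp.choices[0].message.content.strip()

def answer(question: str, db_path: str = "indicant_demo.db"):
    """Assumes a local SQLite file containing the nft_trades table above."""
    sql = question_to_sql(question)
    with sqlite3.connect(db_path) as conn:
        return sql, conn.execute(sql).fetchall()

# e.g. answer("What was the total NFT volume on Ethereum last week?")
# -> "SELECT SUM(price_usd) FROM nft_trades WHERE chain = 'ethereum' AND ..."
```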
Finally, when it comes to user support, we've developed an AI-enhanced customer service bot. We equip the AI with data from Indicant, spanning GameFi, NFT, DeFi, and other areas, to build a customized customer service bot for Indicant. The bot provides immediate assistance by answering questions about the use of Indicant, including data types, data definitions, API usage, and so on. This greatly increases the efficiency of customer support while reducing the effort users spend searching for answers.
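A toy retrieval-augmented version of such a support bot could be sketched as below; the documentation snippets, embedding model, and similarity logic are assumptions for illustration only.

```python
# Toy retrieval-augmented support bot, sketched from the description above. The doc
# snippets, embedding model, and similarity logic are assumptions for illustration only.
import numpy as np
from openai import OpenAI

client = OpenAI()

DOCS = [
    "API usage: authenticate by passing your API key in the X-API-KEY header.",
    "Data definitions: an 'active wallet' is an address with at least one tx in 24 hours.",
    "GameFi tables cover player counts, in-game asset transfers, and token flows.",
]

def embed(texts):
    out = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in out.data])

DOC_VECS = embed(DOCS)

def support_answer(question: str) -> str:
    q = embed([question])[0]
    sims = DOC_VECS @ q / (np.linalg.norm(DOC_VECS, axis=1) * np.linalg.norm(q))
    context = DOCS[int(sims.argmax())]  # retrieve the most relevant snippet
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user",
                   "content": f"Using this documentation:\n{context}\nAnswer: {question}"}],
    )
    return resp.choices[0].message.content
```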
Additionally, it's worth noting that while AI applications can improve efficiency and help solve most challenges, they may not be infallible. Based on our data processing experience, AI can assist in resolving approximately 70% to 80% of challenges.
Q3: What challenges are likely to arise in integrating AI with Web3? Are there issues related to technical complexity, user experience, intellectual property rights, or ethical considerations?
From a broader perspective, regardless of the field in which AI is applied, a critical consideration is how much error people are willing to tolerate from it. Different application scenarios have different error-tolerance requirements, so the quality and reliability of AI must be balanced against people's tolerance for mistakes.
For example, in healthcare, deciding whether to trust an AI or a physician raises trust issues. In the investment space, AI can surface factors that affect BTC price performance, but people may still hesitate when making actual buy or sell decisions.
Conversely, high precision may not be paramount in marketing and growth analytics, such as user profiling and segmentation, because minor errors won't significantly affect outcomes. As a result, AI errors are more readily tolerated in these contexts.
Currently, Indicant is primarily focused on data in its efforts to integrate AI with Web3, which presents its own set of challenges:
The first challenge is data generation: providing high-quality data so that AI can generate data more efficiently and accurately. The relationship between AI and data can be compared to a car's engine and fuel, where AI is the engine and data is the fuel. No matter how advanced the engine, a lack of quality fuel will prevent optimal performance.
This raises the question of how to generate high-quality data, for example, how to quickly and automatically generate data in areas such as GameFi, NFTs, DeFi, and others. This includes automatically organizing the data connections, essentially creating a data graph. More specifically, it involves determining factors such as the protocols to which contracts are related, the types of contracts, the providers, and other relevant information. The primary goal of this process is to consistently provide the AI with high-quality data to improve its efficiency and quality in data processing, thus creating a virtuous cycle.
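As a simplified illustration of the "data graph" idea, the sketch below links contracts to protocols, categories, and providers using networkx; all names are made up for the example, and a real graph would of course be far larger and generated automatically.

```python
# Simplified sketch of the "data graph" idea: linking contracts to protocols, categories,
# and providers so downstream AI receives structured, connected data. Names are illustrative.
import networkx as nx

g = nx.DiGraph()

# Nodes: contracts, protocols, categories, providers (all example values).
g.add_node("0xSwapPool", kind="contract")
g.add_node("ExampleDEX", kind="protocol")
g.add_node("DEX: LP", kind="category")
g.add_node("ExampleLabs", kind="provider")

# Edges capture the relationships mentioned above.
g.add_edge("0xSwapPool", "ExampleDEX", rel="belongs_to")
g.add_edge("0xSwapPool", "DEX: LP", rel="classified_as")
g.add_edge("ExampleDEX", "ExampleLabs", rel="provided_by")

# A consumer (human or AI) can then traverse the graph instead of re-deriving relations.
print(list(g.successors("0xSwapPool")))   # ['ExampleDEX', 'DEX: LP']
```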
The second challenge is data privacy. While Web3 is inherently transparent and open, the need for privacy may become paramount as the industry evolves, including protecting users' identities, assets, and transaction information. This creates a dilemma: as on-chain data becomes less transparent, the amount of data accessible to AI shrinks. The issue should, however, be addressed as the industry progresses, and homomorphic encryption is one potential solution.
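Homomorphic encryption is mentioned here only as a potential direction. As a toy illustration of what it could offer, the sketch below uses the python-paillier (phe) library to sum encrypted values, so an aggregate can be computed without exposing any individual amount; the values are invented for the example.

```python
# Tiny illustration of why homomorphic encryption is a candidate answer: an aggregate
# statistic can be computed without seeing individual values (python-paillier / phe).
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# Hypothetical per-user transaction amounts, encrypted client-side.
encrypted_amounts = [public_key.encrypt(x) for x in (120.5, 75.0, 310.25)]

# An analytics service can sum the ciphertexts without decrypting any single amount
# (Paillier is additively homomorphic).
encrypted_total = sum(encrypted_amounts[1:], encrypted_amounts[0])

# Only the data owner, holding the private key, recovers the aggregate.
print(private_key.decrypt(encrypted_total))  # 505.75
```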
In conclusion, the integration of AI and Web3 is inherently intertwined with a fundamental problem: data accessibility. In essence, the ultimate challenge for AI lies in its access to high-quality data.
Q4: While AI is not a new concept, the integration of AI and Web3 is still in its infancy. So, Navy, what potential areas or combinations of AI within Web3 do you believe could serve as a breakthrough that would attract a significant number of users to Web3 and promote data adoption?
I believe achieving significant integration and adoption of Web3 and AI depends on addressing two fundamental challenges. First, there's a need to provide enhanced services to Web3 builders and developers, especially in areas such as GameFi, NFTs, and social platforms. Second, it's crucial to reduce the barriers on the application front to ensure a smoother user entry into the Web3 ecosystem.
Let's start with serving the developer community. In this area, two primary types of applications stand out.
One category is AI-powered development platforms. These platforms use AI to automate the generation of code templates. Whether for building DEX platforms or NFT marketplaces, they can intelligently generate templates tailored to developers' specific needs, significantly increasing development efficiency.
In games, AI can accelerate the creation of game models and the generation of images, speeding up development and deployment. These platforms allow developers to focus on content and innovation rather than spending excessive time on repetitive, basic tasks.
The other category revolves around AI-enhanced data platforms. These platforms use AI to autonomously generate domain-specific data in various industries such as GameFi, NFTs, SocialFi, and DeFi. The goal is to expand the scope for developers to use and apply data, and simplify data analysis and utilization.
Through AI, these platforms can automatically generate a wide range of datasets, giving developers abundant data resources and a deeper understanding of market trends, user behavior, and more. By providing comprehensive data support, they lower the barriers to data utilization and foster the emergence of innovative applications.
Data adoption has always been a key challenge in the Web3 space. For example, the industry has seen the emergence of blockchain solutions with virtually negligible fees aimed at increasing transactions per second (TPS). In addition, solutions such as MPC wallets address one of the primary barriers to migrating from Web2 to Web3.
The solution to these challenges doesn't depend solely on AI technology but is intertwined with the holistic evolution and development of the Web3 ecosystem. While AI plays a crucial role in improving efficiency and reducing barriers, the underlying progress and growth of Web3 remain key factors in solving the data adoption problem.
Navy Tse serves as the co-founder and CEO of Indicant Analytics, a cutting-edge platform specializing in discovering and visualizing blockchain data.