Introduction
Scroll is a Layer 2 rollup that uses zero-knowledge proof technology to scale the Ethereum blockchain. Its mission is to bring billions of users into Ethereum's ecosystem, become the most secure and trusted Layer 2 network for processing trillions of dollars on-chain, and serve as the default platform for new innovation. SCR total supply: 1,000,000,000.
Original Title: Crypto Marketing Trends & Predictions: 2026 and Beyond
Original Author: @emilyxlai
Translated by: Peggy, BlockBeats

Editor's Note: Marketing in the crypto industry is undergoing a profound transformation: trend lifecycles are getting shorter, competition is fiercer, and traditional tactics are gradually losing effectiveness. For entrepreneurs, growth leads, and marketing teams, understanding these changes is not just a matter of survival but the key to gaining a competitive edge. This article, based on a speech by Emily Lai, CMO of Hype Partners, systematically outlines the 7 core trends in crypto marketing for 2026, covering performance marketing, content creation, channel diversification, event experiences, incentive mechanisms, and AI-driven operations. It also shares industry predictions and a framework for staying ahead. The industry is evolving rapidly; how can you seize opportunities and avoid falling behind? This article gives you the answer.

The following is the original text:

The crypto industry changes in the blink of an eye, with extremely short attention cycles; trends emerge quickly and disappear even faster, with lifecycles becoming increasingly compressed. At the g(t)m con conference held last Sunday (November 16), I shared my observations and experiences from the past year and offered forward-looking insights for 2026. The core of this talk was to share with entrepreneurs, growth leads, and marketing experts our team's outlook on the industry's future, discuss what this means for your marketing strategy, and explain how to stay ahead of the competition.
Ten Months Can Change a Lot

Since my keynote at EthDenver in February 2025, we have witnessed: over 319 new stablecoins launched; institutions and Wall Street entering the space, including enterprise blockchains, DATs, ETFs, and fintech giants adopting stablecoins; a relaxed regulatory environment, the introduction of the GENIUS Act, and the US welcoming a "crypto-friendly" president; new token issuance up over 27%, reaching 567 million tokens at the time of writing; a surge in crypto payment card options, with card transaction volume on traceable blockchains hitting $375 million in October 2025 alone; an explosion in prediction markets, with @Kalshi and @Polymarket setting new trading volume records and new players entering; and new banks and mobile-first financial apps launching on crypto rails.

Crypto in 2024 vs. Today

Last November, the first g(t)m con was held in Bangkok. The main trends at the time included: team-led marketing, founder personal branding, AI agents, interactive "reply guys," brand mascots, airdrops, intern accounts, and the mysterious concept of "mindshare" (brand awareness) promoted by InfoFi platforms. One year later, the industry landscape has clearly shifted: from the APAC liquidity focus, to the return of ICOs, to the rise of "CT Leads," the pace of change in crypto is astonishing.

User Mindshare ≠ Growth (RIP Mindshare)

Over the past year, many highly anticipated TGEs (Token Generation Events) saw weak buying pressure and price performance far below crypto Twitter (CT) sentiment expectations, even under high attention. From a KPI perspective, the industry has refocused on user acquisition (covering both B2B and B2C) and retention. At the narrative and industry meta-trend level, ecosystems and applications are emphasizing "revenue and buybacks." Internal discussions also center on token strategies, tokenomics, and incentive design to alleviate sell pressure.
As infrastructure, base protocols, and middleware mature, the industry's focus is shifting from chains and ecosystems to applications. When traditional financial institutions start deploying capital and fintech apps with millions of users integrate blockchain rails, this not only brings legitimacy to the entire industry but, more importantly, allows us to reach new users beyond CT. As user experience improves, new applications emerge, and trust is built, the addressable market and audience continue to expand. This also means that Web2 user acquisition strategies, previously considered negative ROI/ROAS, are becoming reasonable again.

Hot and Not: A Trend Review

Below is a subjective and incomplete "in and out" list. I first compiled my own views, then gathered opinions from a crypto VC friend, as well as from crypto marketing group chats and CT. Afterwards, I broke these trends and observations down into 7 themes, providing a high-level overview and synthesis and summarizing my learnings from 2025. The original talk was only supposed to be 25 minutes, but thanks to @clairekart's flexibility, I was able to share in a "stream of consciousness" style for 45 minutes on stage.

Performance Marketing

At last November's g(t)m con in Bangkok, I shared about data-driven marketing, focusing on funnel models and key metrics. What seemed important then is even more relevant now. Performance marketing is making a comeback because the industry is refocusing on user acquisition and retention. This means: installing tracking tools (on-chain, product/web, distribution channels); growth experiments; combining paid and organic traffic; evolving from social tasks to liquidity tasks; precise KOL marketing campaigns; and more.
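The funnel framing mentioned above can be made concrete with a toy calculation. The stage names, counts, and ad spend below are illustrative assumptions, not figures from the talk:

```python
# Toy marketing funnel: stage-to-stage conversion rates plus a blended CAC.
# All stage names, counts, and spend are illustrative assumptions.

def funnel_report(stages, ad_spend):
    """stages: ordered list of (name, count). Returns per-stage conversion and CAC."""
    report = []
    for (prev_name, prev_count), (name, count) in zip(stages, stages[1:]):
        rate = count / prev_count if prev_count else 0.0
        report.append((f"{prev_name} -> {name}", round(rate, 3)))
    final_users = stages[-1][1]
    cac = ad_spend / final_users if final_users else float("inf")
    return report, round(cac, 2)

if __name__ == "__main__":
    stages = [
        ("impressions", 500_000),
        ("clicks", 12_500),        # 2.5% CTR (assumed)
        ("wallet_connects", 2_000),
        ("first_tx", 600),
        ("retained_30d", 150),
    ]
    conversions, cac = funnel_report(stages, ad_spend=9_000)
    for step, rate in conversions:
        print(step, rate)
    print("CAC per retained user: $", cac)
```

Tracking each handoff separately is what makes the "install tracking tools" advice actionable: the weakest conversion step is where the next growth experiment should go.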
We're seeing more projects using or inquiring about tools such as:
@spindl_xyz, @gohypelab, @themiracle_io: native wallet placements
@tunnl_io, @yapdotmarket: targeted bounty campaigns for smaller KOLs
@turtledotxyz, @liquidity_land: liquidity marketing campaigns

There are also more precise strategies: I've spoken with some perpetual DEXs that use "white glove" user onboarding, even DMing whale users one-on-one, or leveraging APAC trading KOLs for initial traffic (with incentives, of course). Meanwhile, Web2 paid ad channels are back in focus, including paid social, search ads, and out-of-home (OOH) advertising. Telegram ads remain an underrated channel. In the future, as LLM providers such as OpenAI build out ad product suites, we'll see new ad placement scenarios emerge.

Content, Content, and More Content

This year, we've seen explosive growth of content creators and video on social platforms, with timelines flooded by all kinds of content: from vloggers and short-video creators to technical explainers, livestreams, and even cinematic storytelling. Meanwhile, InfoFi platforms have driven the rise of "brand ambassador" roles: people who actively post ("yap") to promote projects in hopes of earning rewards. However, I believe this trend won't last; "yappers" are already on my OUT list. Leaving the DevConnect venue last week, I joked that DJI's revenue must be soaring, as microphones and cameras were everywhere. We're in the season of content creators. Some creators are freelancers making content for brands they love, like @coinempress and @DAppaDanDev. Brands are also hiring full-time content creators to make videos and vlogs, host Spaces, and even leverage creators' personal brands (like CT Lead @alexonchain). @dee_centralized is one of the leaders of the short-video crypto wave. Six weeks ago, I visited @solana's New York office and toured Solana Studio, a content space designed for founders and creators, where people like @bangerz and @jakeclaychain produce content.
We're also seeing brands hire actors, Hollywood-level studios, and photographers to produce high-quality content and ads. @aave has started ramping up content on Instagram (to warm up its retail mobile app, a smart strategy), while @ethereumfnd has brought in storytelling creators like @lou3ee. Content formats are diversifying: beyond text and video, there are livestream series (like @boysclubworld), static series, podcasts, short video clips, 3D or AI announcement videos, and more. @OctantApp provides grants for creators, and I recently hosted a workshop on the psychological factors brands value in content creation. At Hype (@hypepartners), we held 4 content creator workshops during DevConnect week and brought in @web3nikki in January to lead a new short-video department. Content will continue to saturate; quality, depth, and production value will become more important, and reaching new users beyond CT is equally crucial.

The World Beyond X

This year at Hype, we've explored (and re-explored) new channels, including YouTube, Reddit, AI SEO (like Perplexity, GPT), Instagram, and Whop. In my talk, I focused on LinkedIn and TikTok. Take @Scroll_ZKP co-founder @sandypeng as an example for those not active on LinkedIn: she posted consistently throughout 2025, growing from zero to 6.3 million impressions and 31,000 followers, and shared her strategy and data (publicly for the first time; thanks, Sandy).

Sandy Peng (co-founder of Scroll)'s LinkedIn

In January, we noticed a clear increase in brand demand for channels like Instagram, YouTube, and TikTok, so we brought in @web3nikki to establish a short-video department focused on brand growth and user acquisition, with a special focus on TikTok. The team is made up entirely of TikTok natives, familiar with the algorithm, skilled at creating viral content, and able to adapt content strategies to a crypto perspective. Since the department was established, we've worked with 12 clients, accumulating a wealth of experience and insights.
Events Are Becoming More Immersive and Exclusive

As crypto conference side events become heavily saturated (often over 500 in a single week), organizers are competing fiercely to attract participants. This trend extends to swag: higher quality, better design, and exclusive giveaways. This year, we've seen a significant increase in private dinners. @metamask set a new standard at July's EthCC Cannes event: invite-only guests, taking KOLs and content creators on speedboats, helicopters, and planes. @raave continues to set the standard for crypto music events, inviting world-class DJs and creating top-tier stage effects. Ticket access is tiered, exclusive, and released through a series of marketing campaigns. This experience isn't limited to the real world; it extends to digital: airdrop unboxings, mini-games, Buzzfeed-style personality quizzes, and other shareable interactive experiences are on the rise. We're seeing more inspiration drawn from Web2 brand events, pop-up concepts, and influencer happenings brought into crypto. Last week, we co-hosted a candlelight concert with @octantapp; you can see clips from the event here. Attendance was invite-only, as the venue couldn't accommodate all 20,000 people. If you'd like to join the next experience, contact @cryptokwueen or me.

Reshaping and Redesigning Incentive Mechanisms

This year, we've seen incentive campaigns shift from airdrops toward new forms of privilege. Some incentives are positioned as perks:
"Being able to buy this token is a privilege in itself" (similar to NFT whitelists in 2021)
"Buy now and you'll get the privilege of a discounted purchase"
"Stake now to earn higher yields and/or points from multiple protocols"
"To get the most airdrops, discounts, or points, you must reach top-tier membership" (like airline and hotel loyalty tiers)
All of this reminds me of banks and Web2 fintech companies, which package product usage and access as a privilege. My Chase emails often say: "Congratulations!
You are pre-qualified for mortgage refinancing." In the future, we'll continue to see incentive programs evolve, increasingly resembling loyalty and status-tier programs.

AI in Marketing and Operations

These are the AI trends I've seen in marketing, along with our experience building an internal "context engine" at Hype. In September, we established the Hype AI department, led by @antefex_moon (our VP of AI); for more details, see CEO @0xDannyHype's introduction. We're testing AI extensively at every stage to improve work quality, research, operations, data measurement, and project management. This requires ongoing testing and iteration. We've also launched a new service line, AI SEO / LLM SEO, which works to get your company surfaced in AI answers; whether it is depends on whether you appear in the right places in the training data. Web2 tools like Ahrefs and SEMrush have started offering AI visibility measurement. Meanwhile, OpenAI has officially announced it is exploring an ad platform, which will bring new ad placement scenarios and marketing strategies.

Other Predictions

The above trends and observations have directly influenced some of the business and marketing decisions we've made at Hype. Before sharing my "stay ahead" framework, I collected predictions on crypto marketing from the Hype team. You can read perspectives from @0xdannyhype, @ChrisRuzArc, @groverGPT, @izaakonx, @Timmbo_Slice, and others.

How to Stay Ahead

Trend lifecycles are getting shorter and shorter, for three reasons: moats are weaker (with AI, the internet, and modern tools, it's easier than ever to create content); the crypto industry has a limited audience size; and new companies emerge constantly, competing for attention every day. Marketing requires continuous innovation, testing, and experimentation. Teams that adopt new strategies first can leverage the "novelty effect" to capture brand awareness, until the strategy becomes saturated in the market.
You can also retest old strategies and aesthetics to rekindle a sense of "freshness." It's a never-ending cycle. When others turn left, you turn right; when everyone is turning left and right, you sit under a tree, enter a higher dimension, and explore untouched territory. Then repeat the process. To stay ahead, you must: keep up with industry trends; draw inspiration from outside crypto; and think from first principles (which requires brainstorming, deep thinking, and evaluation, not just copying others). Some questions to help you define predictions and marketing bets: Which trends will become obsolete in the next 6-12 months? Which strategies work in Web2 or other industries but haven't been applied in crypto? Which user behaviors or technological changes will reshape marketing? Ultimately, you're betting on the future. And betting on the future means seeing the patterns and then imagining better possibilities.
Against the backdrop of a continued surge in global AI infrastructure demand, traditional centralized cloud computing systems have gradually revealed capacity bottlenecks and efficiency ceilings. With the rapid penetration of large model training, AI inference, and agent applications, GPUs are transforming from "computing resources" into "strategic infrastructure assets." Amid this structural market transformation, Aethir, leveraging a decentralized physical infrastructure network (DePIN) model, has built the largest and most commercially advanced enterprise-grade GPU computing network in the industry, swiftly establishing a leading position.

Commercialization Breakthrough of Large-Scale Computing Infrastructure

To date, Aethir has deployed over 435,000 enterprise-grade GPU containers worldwide, covering the latest generation of NVIDIA hardware architectures such as the H100, H200, B200, and B300, and has delivered more than 1.4 billion hours of real computing services to enterprise clients. In the third quarter of 2025 alone, Aethir achieved revenue of $39.8 million, driving the platform's annual recurring revenue (ARR) past $147 million. Aethir's growth is driven by genuine enterprise-level demand, including AI inference services, model training, large AI Agent platforms, and production-grade workloads from global game publishers. This revenue structure marks the first time the DePIN sector has seen a large-scale computing platform driven primarily by enterprise payments.

Aethir's infrastructure has been integrated into the core production systems of several cutting-edge AI companies. 1. Kluster.ai, relying on Aethir's computing network, has compressed clinical trial patient screening from months to minutes, greatly enhancing the commercial viability of medical AI. 2. Attentions.ai builds and deploys enterprise-grade private large models through Aethir, bringing no-code AI platforms to traditional industries. 3. Mondrian AI, selected for the "KOREA AI STARTUP 100," uses Aethir as the underlying computing power for its enterprise-grade AI services.

At the gaming industry level, Aethir's production-grade delivery capabilities have undergone large-scale commercial validation. 1.
SuperScale's test data shows that products built on Aethir's instant cloud gaming architecture increased user preference by 43%, click-through rates by 35%, and final paid conversion rates by 45% compared to traditional download distribution. 2. In Reality+'s "Doctor Who: Worlds Apart" project, Aethir drove a 201% increase in installation conversion rates and a 61% increase in ARPU. Currently, more than 400 games have been connected for testing through Xsolla, and leading global publishers such as Scopely, Zynga, and Jam City are conducting in-depth evaluations of the system.

Institutional Endorsement: The Birth of the Strategic Compute Reserve

In October 2025, Aethir completed a $344 million targeted investment in ATH tokens (NASDAQ: POAI) and officially launched the Aethir Digital Asset Treasury (DAT). This mechanism is positioned as the world's first "Strategic Compute Reserve" (SCR) framework, aiming to incorporate decentralized computing assets into enterprise-level long-term balance sheets and to explore deep integration between computing assets and traditional capital markets. As of November 10, 2025, the DAT has disclosed holdings of 5.7 billion ATH and plans to serve AI companies by deploying GPU resources, using the resulting revenue to repurchase ATH, forming a positive cycle of "computing supply → enterprise monetization → ecosystem buyback." This structural capital move makes Aethir one of the very few platforms in the DePIN field to receive substantive recognition from traditional capital markets, and it is the first time "decentralized computing power" has been elevated to the level of institutional asset allocation.

Enterprise-Grade Delivery Capability Under a Decentralized Architecture

The decentralized GPU network built by Aethir has matched or even surpassed traditional centralized cloud providers in performance, stability, and cost structure. H100-grade GPUs currently support over 90% of mainstream large-model inference tasks worldwide, and in the first quarter of 2025 NVIDIA allocated 60% of its production capacity specifically to enterprise AI clients, further highlighting the strategic scarcity of high-end GPUs. Through a distributed hardware supply system, Aethir bypasses the construction cycles and supply chain bottlenecks of traditional data centers, providing enterprises with near bare-metal computing performance at higher resource utilization and more flexible pricing, while significantly reducing overall usage costs.

DePIN Enters the "Real Revenue Driven" Era

According to McKinsey's forecast, global data center construction investment will reach $6.7 trillion by 2030. Meanwhile, the DePIN market is expected to grow to $3.5 trillion by 2028. However, only platforms with genuine enterprise revenue capabilities and scalable delivery capacity are positioned to build a long-term moat in this sector.

About Aethir

Aethir is a world-leading decentralized GPU cloud infrastructure platform dedicated to providing enterprise-grade computing services for AI, gaming, and next-generation Web3 applications. Through a distributed GPU network architecture, Aethir transforms idle computing power worldwide into cloud-level resources that enterprises can access instantly, building a more open, efficient, and decentralized digital infrastructure.
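The "computing supply → enterprise monetization → ecosystem buyback" cycle described above can be illustrated with a toy model. Nothing here comes from Aethir's disclosures: the token price, buyback rate, and supply figure are illustrative assumptions; only the $39.8M quarterly revenue echoes the article.

```python
# Toy model of a revenue-funded token buyback loop.
# Parameters (supply, price, buyback_rate) are illustrative assumptions,
# NOT Aethir's actual figures or mechanism.

def simulate_buybacks(circulating, price, quarterly_revenue, buyback_rate, quarters):
    """Each quarter, a fixed share of revenue buys tokens off the market."""
    history = []
    for _ in range(quarters):
        spend = quarterly_revenue * buyback_rate
        tokens_bought = spend / price
        circulating -= tokens_bought   # repurchased tokens leave circulation
        history.append((round(circulating, 2), round(spend, 2)))
    return history

if __name__ == "__main__":
    # 1B tokens, $0.05/token, $39.8M quarterly revenue, 10% routed to buybacks
    for supply, spend in simulate_buybacks(1_000_000_000, 0.05, 39_800_000, 0.10, 4):
        print(f"supply={supply:,.0f} buyback=${spend:,.0f}")
```

The point of the sketch is the feedback structure, not the numbers: as long as enterprise revenue persists, circulating supply ratchets down each period, which is the claimed "positive cycle."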
Original source: Aethir
Source: Techub Hotspot Express
Author: Glendon, Techub News
Original Title: Ethereum "Interop Layer" Solution: From the Chain-Management "Maze" to the "Broad Road" of the Network Era

Ethereum is building a highly unified and collaborative chain ecosystem. Last night, the Ethereum Foundation's official blog revealed that its Accounts and Chain Abstraction team has proposed the "Ethereum Interop Layer (EIL)," a solution that aims to merge all Layer 2 (L2) networks into a single, unified Ethereum chain at the level of user perception. Cross-L2 transactions would become as seamless as single-chain transactions while preserving the foundations of minimal trust and decentralization. The interop-layer concept was first proposed at the end of August this year and is currently in testing and development.

Previously, Ethereum achieved large-scale expansion through Rollup technology, dramatically reducing transaction costs and providing ample block space, marking the gradual realization of its vision as a global computing platform. But technology often cuts both ways, and this evolution brought an unexpected side effect, the most prominent being a fragmented user experience.

The Prosperity and Dilemma of L2

The current L2 ecosystem is a complex landscape of scattered islands: each chain has its own gas model, cross-chain bridge system, and even wallet ecosystem. When users move assets between networks such as Arbitrum, Base, and Scroll, they must manually select the chain, confirm the cross-chain route, and trust third-party liquidity providers. This operational complexity runs counter to Ethereum's original promise of a "seamless, trustless" experience. From a user experience perspective, the consequences of this fragmentation are severe.
Ethereum's once-smooth experience advantage has been severely eroded, replaced by the complexity of operating across multiple independent "mini-Ethereums." Users no longer deal with simple, direct transactions; they face a pile of L2s. This brings not only operational friction and cognitive burden but also additional trust assumptions, such as reliance on bridges, relayers, and sequencers, and it quietly increases censorship risk.

Before the Ethereum Interop Layer (EIL) proposal, some industry solutions had already attempted to unify the L2 user experience, but most deviated from Ethereum's core values. Some introduced intermediaries into transactions, weakening censorship resistance; others entrusted funds to third parties, greatly reducing security, while logic running on third-party servers undermined transparency and the open-source spirit. The accumulation of these contradictions gave rise to a fundamental need: rebuild a single-chain-like user experience while keeping the scale advantages of L2. So how does the Ethereum Interop Layer (EIL) resolve this contradiction?

The Philosophical Foundation of EIL: A Trustless Interoperability Paradigm

The Ethereum Interop Layer (EIL) is the key breakthrough for resolving this contradiction. Its core positioning is a secure and efficient communication protocol, not a financial instrument. EIL's design logic is to make Ethereum Rollup transactions as seamless as single-chain transactions, allowing users to complete cross-chain transactions with a single signature without introducing new trust assumptions. Its design philosophy is rooted in two pillars: ERC-4337 account abstraction and the Trustless Manifesto.
ERC-4337 account abstraction standardizes account logic, enabling EIL to let users initiate cross-chain operations directly from their wallets without relying on relayers or solvers. The mechanism works as follows: in EIL, users use ERC-4337 accounts whose logic is optimized for multi-chain scenarios. The wallet generates several different UserOps, then authorizes a single signature over the Merkle root of all of them. The verification step for each on-chain account requires (i) a UserOp, (ii) a Merkle branch proving it belongs to the tree, and (iii) a signature over the Merkle tree root. The main advantage of this approach is hardware wallet support: hardware wallets typically cannot generate N signatures at once, but here the wallet needs only a single click from the user to complete the signature.

On this basis, EIL's design strictly follows the Trustless Manifesto. EIL places the key logic on-chain and in the user's wallet, ensuring all operations execute in a verifiable on-chain environment. For example, when a user mints a cross-chain NFT, the wallet automatically merges multi-chain balances and handles gas fees transparently, without entrusting funds to liquidity providers. This design safeguards Ethereum's four core values: self-custody (users retain full control of assets), censorship resistance (no intermediary or centralized node can block transactions), privacy (smart contracts replace intermediaries, so users need not disclose their IP address or intent to relayers or solvers), and verifiability (all logic is open-source and auditable). As the Ethereum Foundation emphasizes, in terms of technical architecture EIL is the equivalent of Ethereum's "HTTP protocol."
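The single-signature-over-many-UserOps scheme described above can be sketched in a few lines. This is a minimal illustration of the Merkle construction only, not EIL's actual encoding: the hash function (SHA-256), the leaf bytes, and the odd-node duplication rule are simplifying assumptions made for the example.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root of a simple binary Merkle tree over the given leaves."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                      # duplicate last node if odd
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_branch(leaves, index):
    """Sibling hashes proving leaves[index] belongs to the tree."""
    level = [h(leaf) for leaf in leaves]
    branch = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        branch.append(level[index ^ 1])         # sibling at this level
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return branch

def verify_branch(leaf, branch, index, root):
    """Recompute the root from one leaf and its branch; compare."""
    node = h(leaf)
    for sibling in branch:
        node = h(node + sibling) if index % 2 == 0 else h(sibling + node)
        index //= 2
    return node == root

# One UserOp per chain; the wallet asks the user to sign only the root, once.
user_ops = [b"userop-arbitrum", b"userop-base", b"userop-scroll"]
root = merkle_root(user_ops)

# Each chain later receives: its own UserOp, a Merkle branch, and the
# (single) signature over the root; here we check just the branch part.
proof = merkle_branch(user_ops, 1)
assert verify_branch(user_ops[1], proof, 1, root)
```

This is why hardware wallets work with the scheme: however many chains are involved, the device signs exactly one 32-byte root.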
Just as HTTP unified server access in the early Internet, EIL's goal is to make the wallet the universal gateway into the multi-chain ecosystem, ultimately realizing the vision of "many L2s, one Ethereum." For users, this is a shift from managing chains to no longer perceiving chains. The rollout of EIL will fundamentally change how users interact with the multi-chain ecosystem in three key scenarios: cross-chain transfers, cross-chain minting, and cross-chain swaps. Users can operate with one click, without selecting cross-chain routes or paying extra fees as in the traditional model. At the heart of this experience is the "wallet as portal" design, which fully encapsulates cross-chain complexity.

Ultimately, the introduction of EIL will trigger a cascade of effects across the Ethereum ecosystem:

Evolution of wallets and DApps: wallet providers no longer need custom integrations for each new chain. EIL's standardized interface makes multi-chain support a default feature, so developers can focus on user experience innovation rather than redundant cross-chain infrastructure.

Rapid onboarding of Rollups: when new networks join the ecosystem, EIL's compatibility design lets them integrate seamlessly into existing wallet systems, accelerating technical iteration and user growth.

Consolidation of the trust model: EIL eliminates reliance on off-chain operators, upgrading cross-chain interoperability from a "centralized exchange model" to a "decentralized exchange model." User assets are always held by smart contracts, with no counterparty risk, further strengthening Ethereum's commitment as the "world computer."

It is worth noting that EIL will reshape the existing market landscape by removing the need for intermediaries such as relayers and solvers.
As users gravitate toward wallet-native services, projects that solve or relay L2 transactions will inevitably decline; their transaction volume could fall by more than 80%, and the niche could disappear entirely, forcing these intermediaries to adapt and transform quickly or face extinction. Overall, the significance of the Ethereum Interop Layer (EIL) goes far beyond the technical level. It is a return to Ethereum's original intent: a global, open, seamless, and trustless computing platform. When wallets become universal gateways and cross-chain operations are as simple as single-chain transactions, Ethereum's "network era" will truly arrive.
Key Notes

- The Ethereum Interop Layer aims to provide single-wallet access across all L2s, with no need for bridges or relayers.
- EIL preserves self-custody and censorship resistance, moving cross-chain logic into verified smart contracts.
- Developers gain multichain-native wallets that simplify integration and automatically support new rollups.

The Ethereum Foundation has unveiled the Ethereum Interop Layer (EIL), a technical initiative intended to unify the user experience of Ethereum's expanding rollup ecosystem, according to a recent proposal by the Account Abstraction team. EIL promises a wallet-driven solution for seamless activity across Layer 2s, aiming to make Ethereum function as a single chain for users and developers.

EIL targets fragmentation in Ethereum's layer 2 ecosystem

The emergence of rollups brought efficiency and affordable transactions, but introduced fragmentation for both assets and user experience. Navigating tokens that reside on Arbitrum, Base, Scroll, or Linea currently requires awareness of each chain's specifics, the use of bridges and relayers, and constant manual interaction. EIL seeks to eliminate these touchpoints by abstracting away the complexities and consolidating transaction logic inside the user's wallet. With the interop layer, users can perform operations such as sending tokens, minting NFTs, or swapping assets across Layer 2s with a single click, without needing to identify or interact with individual chains.

(Chart: the proliferation of Layer 2 chains makes Ethereum hard to use. Source: CoinGecko)

Preserving Ethereum's core security principles

EIL is built on top of the ERC-4337 account abstraction and guided by the principles outlined in the Trustless Manifesto, unveiled on Nov. 13. The system ensures that all cross-L2 actions are initiated and settled directly from user wallets, without new trust assumptions or intermediaries.
Essential values like self-custody, censorship resistance, privacy, and on-chain verifiability are maintained. The trust model stays minimal; users do not rely on third-party bridges or off-chain operators but instead transact under rules encoded in smart contracts and open-source wallet code.

Impact on users and developers

For users, EIL is designed to feel like "one Ethereum," removing friction caused by fragmented balances and chain-specific procedures. Transactions such as cross-chain transfers, minting, and swapping are executed as if all assets coexisted on a unified ledger. Wallets become universal portals, with the selection of chains and coordination of asset movement handled invisibly behind the scenes, according to the Ethereum Foundation blog. From a developer perspective, EIL centralizes interoperability within the wallet, bypassing the need for bespoke app-level integrations and accelerating the onboarding of new networks. As a result, dapps and wallets are multichain-native out of the box, enabling a familiar and streamlined experience for both new and existing rollups.

Ethereum's next steps toward unified scalability

The Ethereum Interop Layer marks a move from transaction throughput achievements to improvements in interaction simplicity and cross-chain composability. By giving users a window into the whole Ethereum ecosystem through a single wallet interface, EIL aims to restore the sense of unity and trustless operation that characterized Ethereum's early vision. The initiative, from the team at the Ethereum Foundation, calls on wallet teams, dapp builders, and network designers to participate in its development and help realize the prospect of a seamless, singular Ethereum network.
Original Title: "Circle's Stablecoin Public Chain Arc Testnet Interaction Guide"
Original Author: Asher, Odaily Planet Daily

Last week, Circle's stablecoin Layer 1 project Arc announced on the X platform that its public testnet is live. Below, Odaily Planet Daily walks you through "zero-cost" participation in the Arc testnet interactions, positioning you for a potential token airdrop.

Arc: A Layer 1 Dedicated to Stablecoins, Launched by Circle

Arc is a next-generation EVM-compatible Layer 1 blockchain launched by Circle, the "first stablecoin stock," aiming to build the economic operating system of the internet by deeply integrating programmable stablecoins with on-chain financial innovation. Arc is designed for financial applications, focusing on global payments, forex, lending, and capital markets, with the goal of providing a secure, low-cost, compliant, and scalable settlement layer for the internet's programmable money. Arc aims to address three major pain points that existing public chains face in enterprise and institutional-grade financial applications: inadequate high-frequency transaction performance, lack of privacy and compliance support, and excessive fee volatility. By optimizing its architecture and introducing a stable fee model, Arc targets an efficient, financial-grade transaction experience, pushing stablecoins beyond the "digital dollar" toward core infrastructure for global payments, lending, forex, and capital markets.

Arc Testnet Interaction Guide

STEP 1. Add the Arc test network to your wallet: scroll to the bottom of the page, click Add Arc Testnet in the bottom-left corner, and confirm in the wallet popup.

STEP 2. Claim testnet tokens: receive both USDC and EURC.

STEP 3. Send GM on the Arc testnet: connect your wallet, find GM on Arc Testnet, click it, and confirm in the wallet popup.

STEP 4. Deploy a contract on the Arc testnet: find Arc Testnet, click Deploy, and confirm in the wallet popup.

STEP 5. Send GM on the ZKCODEX platform: connect your wallet, find Arc Testnet, click Send GM, and confirm in the wallet popup.

STEP 6. Deploy on the ZKCODEX platform: choose Arc Testnet, connect your wallet, click Simple Deploy, Token Deploy, and NFT Deploy in turn, confirming each in the wallet popup.

STEP 7. Mint another NFT on the ZKCODEX platform: choose Arc Testnet, click Mint 1 NFT, and confirm in the wallet popup.

STEP 8. Register a .arc domain on the InfinityName platform: connect your wallet, enter your desired domain name, find Arc Testnet Registration, and confirm in the wallet popup.

That is the complete tutorial for interacting with the Arc testnet. If any testnet incentive activities are announced, Odaily will cover them as soon as possible. In addition, on October 30, Arc released the first batch of 11 projects built on the public testnet, which are also worth watching: on-chain stablecoin protocol ZKP2P, universal crypto trading platform Sequence, agent-interconnection platform Superface, stablecoin wallet infrastructure Blockradar, stablecoin banking service Copperx, crypto API development company Crossmint, cross-border remittance and fund management program Hurupay, wallet infrastructure Para, personalized finance platform CFi, zero-knowledge-proof-based wallet Hinkal, and cross-chain infrastructure Axelar Network.
Original author: Eric, Foresight News On November 1, Vitalik quoted a tweet from the founder of ZKsync about the ZKsync Atlas upgrade and praised ZKsync for doing a lot of "underrated but valuable work for the Ethereum ecosystem." The market quickly responded to Vitalik's comments, with the price of ZK surging more than 2.5 times over the weekend. Tokens in the ZK ecosystem, including ALT (AltLayer), STRK (Starknet), SCR (Scroll), MINA (Mina), and others, also saw significant gains. After learning about the ZKsync Atlas upgrade, we found that what ZKsync has accomplished may indeed have been underestimated. Fast, Small but Expensive ZKP The Ethereum Foundation has been promoting ZKP (Zero-Knowledge Proofs) from an early stage, essentially aiming to solve the problems of slow verification speeds and large amounts of data to be verified. ZKP is essentially a mathematical probability problem. To illustrate its principle with a not entirely accurate example: suppose someone claims to have solved the "Four Color Theorem." How can we confirm this person's solution without fully disclosing it? The zero-knowledge proof approach is to select some parts of the entire graph and prove that no two adjacent regions in these parts have the same color. When the number of selected parts reaches a certain value, the probability that this person has solved the Four Color Theorem reaches 99.99...%. At this point, we have proven the solution without knowing the full details. This is the commonly heard "proving something was done without knowing how it was done" aspect of zero-knowledge proofs. The reason for vigorously promoting ZKP in the Ethereum ecosystem is that, in theory, ZKP can achieve much faster speeds than verifying each transaction individually, and the proof generated itself is very small in data size. The speed advantage comes from the fact that ZKP does not require knowledge of the entire process, only a challenge. 
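The sampling intuition in the Four Color Theorem example can be made concrete with a toy simulation. This is illustrative only; real ZK protocols rely on algebraic commitments and challenges, not naive random sampling. If a fraction p of the adjacency constraints are actually violated, a cheater survives k independent spot checks with probability (1 - p)^k, which collapses quickly as k grows:

```python
import random

def spot_check(bad_fraction: float, num_checks: int,
               trials: int = 10_000, seed: int = 0) -> float:
    """Estimate the probability that num_checks random spot checks all
    pass when bad_fraction of the constraints are actually violated."""
    rng = random.Random(seed)
    fooled = 0
    for _ in range(trials):
        # Each check samples one constraint; it passes with prob 1 - p.
        if all(rng.random() >= bad_fraction for _ in range(num_checks)):
            fooled += 1
    return fooled / trials

# Analytically, the cheater survives k checks with probability (1 - p)^k.
p = 0.05  # suppose 5% of adjacent-region pairs share a color
for k in (1, 10, 100):
    print(f"checks={k:3d}  simulated={spot_check(p, k):.4f}  "
          f"analytic={(1 - p) ** k:.4f}")
```

With p = 5%, a single check misses the cheat 95% of the time, but 100 checks miss it well under 1% of the time; the verifier gains near-certainty without ever seeing the full coloring.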
For example, to verify an Ethereum block, the current method is for every node to check basic conditions such as whether the sender of each transaction has sufficient balance. But if a single node verifies every transaction via ZKP and then generates a "proof," other nodes only need to verify the reliability of the proof itself. More importantly, the proof is very small, so transmitting and verifying it is extremely fast, and storing it is cheap.

The reason this seemingly all-advantageous technology is not yet widely used is simple: it is too expensive. Although ZKP does not require replaying the entire computation, generating the proof itself consumes enormous computing power. Stacking GPUs as aggressively as in the AI arms race buys speed, but not everyone can afford that. However, if algorithmic and engineering innovations can cut the required computing power and the proof-generation time far enough that the gains from the new applications the technology unlocks balance the cost of running nodes and buying GPUs, the effort becomes worthwhile. Hence, many ZK projects and open-source developers in the Ethereum ecosystem focus on generating ZK proofs faster and at lower cost.

Recently, the Brevis team proved an Ethereum block in an average of 6.9 seconds (99.6% of proofs completed within the current 12-second Ethereum block time) at only half the cost of the SP1 Hypercube solution (64 RTX 5090 GPUs), earning collective praise from the Ethereum community. Although the GPU cost still exceeds $100,000, proof generation is at least now as fast as block production without ZKP; the next task is to bring the cost down.
The Atlas Upgrade Achieves 1-Second ZK Finality Perhaps many people don't know that ZKsync's open-source zkVM, ZKsync Airbender, is the fastest zkVM for single GPU verification. According to Ethproofs data, using a single 4090 GPU, ZKsync Airbender achieves an average verification time of 51 seconds at a cost of less than one cent, both of which are the best results among zkVMs. According to data provided by ZKsync, excluding recursion, Airbender uses a single H100 and the ZKsync OS storage model to verify the Ethereum mainnet with an average time of 17 seconds. Even including recursion, the total average time is only about 35 seconds. ZKsync believes this is much better than needing dozens of GPUs to achieve verification within 12 seconds. However, since there is currently only data for two GPUs averaging 22.2 seconds, the actual performance is still inconclusive. And all this is not solely the credit of Airbender; algorithmic and engineering optimizations are only part of it. The deep integration with the ZKsync technology stack is the key to maximizing results. More importantly, it demonstrates that real-time proof of the Ethereum mainnet using a single GPU is possible. At the end of June, ZKsync launched Airbender, and on the penultimate day of the National Day holiday, the Atlas upgrade went live. This upgrade, which integrates Airbender, has significantly improved ZKsync's throughput, confirmation speed, and costs. In terms of throughput, ZKsync optimized the sequencer at the engineering level: by using independent asynchronous components to minimize the consumption caused by synchronization; separating the state required by the virtual machine, the state required by the API, and the state required to generate zero-knowledge proofs or verify zero-knowledge proofs on L1, thereby reducing unnecessary overhead of components. 
According to ZKsync's field tests, TPS reached 23k for high-frequency price updates, 15k for stablecoin transfers in payment scenarios, and 43k for native ETH transfers. Another huge qualitative leap comes from Airbender, which helps ZKsync achieve 1-second block confirmation and a per-transfer cost of $0.0001. Unlike verifying mainnet blocks, ZKsync only verifies the validity of state transitions, so the computational load is far smaller. Although transactions with ZK finality still require mainnet verification to reach L1 finality, the ZK verification already establishes the transaction's validity; L1 finality is more of a procedural guarantee. In other words, transactions executed on ZKsync need only ZKP verification to be confirmed as valid, and with costs dramatically reduced, ZKsync has achieved, in its own words, application scenarios that only Airbender makes possible.

First, naturally, are applications such as on-chain order books, payment systems, exchanges, and automated market makers: Airbender lets the system verify and settle extremely fast, reducing rollback risk for these on-chain applications. Second, and something many current L2s cannot offer, is interoperability between public and private systems (such as ZKsync's Prividiums) without third parties. Prividiums is infrastructure launched by ZKsync to help enterprises build private chains. For enterprises, the key blockchain requirements are fast settlement and privacy. Fast settlement needs no further explanation, and the inherent privacy of ZKP allows enterprise private chains to verify transaction validity when interoperating with public chains without exposing the chain's own ledger information. The combination of the two even meets the settlement-time requirements for compliant on-chain securities and forex trading.
This may also be the reason why ZKsync has become the second largest tokenized RWA asset issuance network after Ethereum. ZKsync also proudly states that all of this is only possible under the Atlas upgrade: the sequencer provides low-latency transaction packaging, Airbender generates proofs within one second, and then Gateway verifies and coordinates cross-chain messages. Connecting L1 and L2 As Vitalik retweeted, ZKsync founder Alex believes that after the Atlas upgrade, ZKsync has truly achieved connectivity with the Ethereum mainnet. Now, ZKsync's transaction final confirmation time (about 1 second) is shorter than the Ethereum mainnet block time (average 12 seconds), which means that institutional and RWA transactions conducted on ZKsync are essentially the same as those on the Ethereum mainnet, just waiting for mainnet confirmation. This means that ZKsync does not need to build a liquidity center on L2; it can directly use the mainnet's liquidity, because ZK Rollup's cross-chain with the mainnet does not require a 7-day challenge period like OP Rollup, and the Atlas upgrade further accelerates the process. This improves the L2 fragmentation issue recently discussed in the Ethereum community. L2 and L1 are no longer two separate chains, but are connected as one through fast confirmation and verification, and for the first time, L2 can truly be called a "scaling network." Recall that when ZKsync and Scroll first launched on the mainnet, transaction confirmation speed and gas fees were the same as or even higher than the mainnet. This was essentially because there had not yet been systematic algorithmic and engineering optimizations for ZKP, resulting in slow verification and high costs, which at the time triggered a trust crisis for ZK Rollup. 
Today, Optimism and Arbitrum are gradually transitioning from OP Rollup to ZK Rollup (or a hybrid of both), and the further cost and speed improvements from ZKsync and other ZK Rollups, along with Scroll's decentralized ZKP work, have turned what was once dismissed as "nonsense" into results worth looking forward to. From being criticized by everyone to becoming highly sought after, ZK has ushered in a new dawn. Once the sequencer and cross-chain bridge multisig are fully decentralized, perhaps it will truly be possible to achieve what Dragonfly Managing Partner Haseeb Qureshi called "can't be evil."
For a single GPU, Airbender not only has the fastest verification speed but also the lowest cost. Written by: Eric, Foresight News On November 1, Vitalik quoted a tweet from the founder of ZKsync regarding the ZKsync Atlas upgrade and praised ZKsync for doing a lot of "underrated but highly valuable work for the Ethereum ecosystem." The market quickly reacted to Vitalik's comments, with ZK prices surging more than 2.5 times at their peak over the weekend. Tokens in the ZK ecosystem, including ALT (AltLayer), STRK (Starknet), SCR (Scroll), MINA (Mina), and others, also saw significant gains. After learning about the ZKsync Atlas upgrade, we found that what ZKsync has accomplished may indeed be underestimated. Fast, Small but Expensive ZKP The Ethereum Foundation has promoted ZKP (Zero-Knowledge Proofs) from early on, essentially aiming to solve the problems of slow verification speed and large amounts of verification data. ZKP is essentially a mathematical probability problem. To give a not entirely accurate example to roughly explain its principle: Suppose someone claims to have solved the "Four Color Theorem." How can we confirm this person has indeed solved it without fully disclosing their solution? The zero-knowledge proof approach is to select some parts of the entire graph and prove that in these parts, no two adjacent regions have the same color. When the number of selected parts reaches a certain value, it can be shown that the probability this person has solved the Four Color Theorem is 99.99...%. At this point, we have proven that they have "indeed solved the Four Color Theorem" without knowing the full details. This is what people often refer to as "proving something was done without knowing how it was done"—the essence of zero-knowledge proofs. 
The reason for vigorously promoting ZKP in the Ethereum ecosystem is that, in theory, ZKP's speed ceiling is much higher than that of proving each transaction individually, and the amount of data generated by the proof itself is very small. The speed advantage comes from the fact that ZKP does not require knowledge of the full picture, only challenges. For example, to verify an Ethereum block, the current method is for each node to verify basic issues such as whether the execution address of each transaction has sufficient balance. But if only one node verifies each transaction using ZKP and then generates a "proof," other nodes only need to verify the reliability of the "proof" itself. More importantly, the data size of this "proof" is very small, so its transmission and verification are extremely fast, and the cost of storing the data is lower. The reason why this seemingly all-advantageous technology is not widely used is simply because it is too expensive. Although ZKP does not require reproducing all processes, the challenge itself consumes a lot of computing power. If you stack GPUs like in the AI arms race, you can achieve faster speeds, but not everyone can afford such costs. However, if algorithmic and engineering innovations can reduce the required computing power and the time to generate proofs under low computing power to a certain level, achieving a balance between "price increases driven by more applications introduced through technological innovation" and "the cost of setting up nodes and purchasing GPUs" for Ethereum, then this becomes a worthwhile endeavor. Therefore, many ZK concept projects or open-source developers in the Ethereum ecosystem focus on generating ZK proofs at lower costs and faster speeds under low costs. 
Not long ago, the Brevis team achieved an average of 6.9 seconds to prove an Ethereum block (99.6% of proof times were less than the current Ethereum block time: within 12 seconds) using only half the cost (64 RTX 5090 GPUs) of the SP1 Hypercube solution. This achievement earned collective praise from the Ethereum community for this very reason. Although GPU costs still exceed $100,000, at least the proof speed has dropped to the current level without ZKP. The next task for everyone is to reduce the cost. The Atlas Upgrade Achieves 1-Second ZK Finality Perhaps many people don't know that ZKsync's open-source zkVM, ZKsync Airbender, is the fastest zkVM for single GPU verification. According to Ethproofs data, using a single 4090, ZKsync Airbender's average verification time is 51 seconds, with a cost of less than one cent—both are the best results among zkVMs. According to data provided by ZKsync itself, excluding recursion, Airbender uses a single H100 and the ZKsync OS storage model to verify the Ethereum mainnet with an average time of 17 seconds. Even including recursion, the total average time is only about 35 seconds. ZKsync believes this is much better than needing dozens of GPUs to achieve verification within 12 seconds. However, since there is currently only data for two GPUs with an average of 22.2 seconds, the actual performance is yet to be determined. All of this is not solely due to Airbender; algorithmic and engineering optimizations are only part of the story. The deep integration with the ZKsync tech stack is the key to maximizing results. More importantly, it demonstrates that real-time proof of the Ethereum mainnet using a single GPU is possible. At the end of June, ZKsync launched Airbender, and on the penultimate day of the National Day holiday, the Atlas upgrade went live. This upgrade, which integrated Airbender, significantly improved ZKsync's throughput, confirmation speed, and cost. 
In terms of throughput, ZKsync optimized the sequencer at the engineering level: by using independent asynchronous components to minimize the overhead caused by synchronization; separating the state required by the virtual machine, the state required by the API, and the state required to generate or verify zero-knowledge proofs on L1, thereby reducing unnecessary component overhead. According to ZKsync's field tests, TPS for high-frequency price updates, stablecoin transfers in payment scenarios, and native ETH transfers reached 23k, 15k, and 43k, respectively. Another huge qualitative leap comes from Airbender, which helped ZKsync achieve 1-second block confirmation and a single transfer cost of $0.0001. Unlike verifying mainnet blocks, ZKsync only verifies the validity of state transitions, so the computation is much less than verifying mainnet blocks. Although transactions with ZK finality still require mainnet verification to achieve L1 finality, ZK verification already confirms the validity of the transaction, and L1 finality is more of a procedural guarantee. In other words, transactions executed on ZKsync only need ZKP verification to be fully confirmed as valid, and with the greatly reduced cost, ZKsync has achieved, in their own words, application scenarios that only Airbender can bring: First, naturally, are applications such as on-chain order books, payment systems, exchanges, and automated market makers. Airbender enables the system to verify and settle at extremely fast speeds, reducing the risk of rollbacks for these on-chain applications. The second point is something that many current L2s cannot achieve: supporting interoperability between public and private systems (such as ZKsync's Prividiums) without third parties. Prividiums is ZKsync's infrastructure to help enterprises build private chains. For enterprises, the requirements for blockchain are fast settlement and privacy. 
Fast settlement needs no further explanation, and the inherent privacy of ZKP lets an enterprise private chain prove transaction validity when interoperating with public chains without exposing its own ledger. Together, the two even meet the settlement-time requirements that compliance rules impose on on-chain securities and forex trading. This may also be why ZKsync has become the second-largest tokenized RWA issuance network after Ethereum. ZKsync is also proud to state that all of this is only possible under the Atlas upgrade: the sequencer provides low-latency transaction packaging, Airbender generates proofs within one second, and the Gateway verifies and coordinates cross-chain messages.

Bridging L1 and L2

In a post retweeted by Vitalik, ZKsync founder Alex argues that after the Atlas upgrade, ZKsync is truly bridged to the Ethereum mainnet. Now that ZKsync's final confirmation time (about 1 second) is shorter than the Ethereum mainnet block time (12 seconds on average), institutional and RWA transactions conducted on ZKsync are essentially equivalent to mainnet transactions that are merely awaiting mainnet confirmation. This means ZKsync does not need to build up separate liquidity hubs on L2; it can tap mainnet liquidity directly, because a ZK Rollup's cross-chain mechanism with the mainnet needs no 7-day challenge period of the kind OP Rollups require, and the Atlas upgrade accelerates the process further. This eases the L2 fragmentation issue recently discussed in the Ethereum community: L2 and L1 are no longer two separate chains but are connected as one through rapid confirmation and verification. For the first time, L2 can truly be called a "scaling network." Recall that when ZKsync and Scroll first launched on mainnet, confirmations were as slow as the mainnet's and gas fees were as high or even higher.
This was essentially because ZKP had not yet seen systematic algorithmic and engineering optimization, so proving was slow and expensive, which at the time even triggered a crisis of confidence in ZK Rollups. Today, Optimism and Arbitrum are gradually transitioning from OP Rollup to ZK Rollup (or a hybrid of the two), and the further cost and speed improvements from ZKsync and other ZK Rollups, along with Scroll's decentralized proving, have turned what was once dismissed as "nonsense" into something worth looking forward to. From being criticized by everyone to being highly sought after, ZK has reached a new dawn. Once the sequencers and cross-chain bridge multisigs are fully decentralized, perhaps the "can't be evil" vision described by Dragonfly Managing Partner Haseeb Qureshi can truly be realized.
BlockBeats News, November 2: according to market data, driven by ZKsync's single-day gain of over 88%, tokens in the ZK and L2 sectors have posted significant gains today: ALT is up 22.3% over 24 hours; STRK up 17.1%; SCR up 17.5%; MINA up 43.2%. Previous reports indicated that Vitalik has been closely following the progress of ZKsync's upgrades and has interacted with the project multiple times, praising ZKsync's undervalued but valuable contributions to the Ethereum ecosystem.
Foresight News reported that Scroll has announced the launch of a points program aimed at rewarding early adopters. Scroll will automatically track and calculate points for purchasing, holding, and using USX, with no manual claiming required.
Original Title: How to do a good research? Original Author: le.hl Original Translation: Luffy, Foresight News

As an investor, the easiest way to lose money is to blindly follow the crowd, knowing nothing about a project and entering the market purely on others' advice. I have had that experience, so I am sharing my project-research method here. If you are a cryptocurrency newbie who needs a reliable, practical method, this article is for you.

Define the Project Narrative

Narrative is one of the core elements of the cryptocurrency industry, and market trends often revolve around narratives. If you want to invest in a project, you must first understand the narrative logic behind it. If the project is still stuck in outdated narratives like the metaverse or GameFi, it will likely have a hard time succeeding. I usually look up a project's narrative on certain well-known platforms. Steps: 1. Open the platform; 2. Enter the project name; 3. Scroll down to the "Tags" section to view the project's narrative. After understanding the narrative, the next step is to identify the leading project in that sector: observe its recent trading-volume changes to gauge momentum, and evaluate whether the project you are interested in can compete with the leader. Remember, investing in the leader's competitors is often a better opportunity than chasing a leader that has already skyrocketed. Choosing a currently trending narrative (such as AI, prediction markets, InfoFi, etc.) is the best path to profitability.

Verify the Project's Investors

Today, many people are averse to the term "venture capital" and prefer self-funded projects. The fact remains: if a project lacks an excellent product, has a mediocre team, and does not lead any narrative, it needs reliable investors to drive its development.
My go-to platform for looking up a project's investors is CryptoFundraising, which displays all the key information about a project's investors, team, social accounts, official website, and more, completely free of charge. Steps: 1. Open CryptoFundraising; 2. Search for the target project; 3. Check the funding amount and the tier of the VCs involved. I have found that projects with smaller raises backed by only 2-3 VCs usually perform better than those with 20 or more. It's like a cake divided among too many people: the team has to win approval from every VC for its decisions, which is restrictive. The VC tier also matters. I personally prefer projects backed by VCs such as Coinbase Ventures, a16z, Polychain Capital, Paradigm, and GSR.

Examine the Project's Social Dynamics

This step is crucial. If a project disables comments or frequently changes its social-account handle, it is best to avoid it outright. The number of notable accounts following the project is also worth checking: more than 20 industry figures following it is usually a positive sign. To verify a project's legitimacy, you can also use network platforms with information-verification features: 1. Open the platform; 2. Install its Chrome browser extension; 3. Configure it as follows: if a project has negative feedback, block it so you no longer see the related noise on X. No invitation code is needed, and the plugin is free.

Deep Research: Core Dimensions

Founder: I prefer to invest in projects whose founder is active in the cryptocurrency community on a daily basis and engages with users. Outstanding founders have strong conviction in their project and are willing to admit mistakes. Avoid founders who claim the community is everything but then behave arrogantly, stay disconnected from users, or remain anonymous. The founder's conduct often determines the project's direction after launch.
Product: Usability is the metric I value most. Only a simple, user-friendly product can attract real users and generate revenue. Even the most amazing concept (like "quantum blockchain solves global hunger") will get no attention if it is complicated to operate and hard to use.

Tokenomics: For projects with issued tokens, be cautious if you see tokens distributed to groups unrelated to the project (e.g., handed to platforms like Binance Alpha for short-term hype without any substantive support in return). This behavior often leads to a failed Token Generation Event (TGE) and a dismal price trend afterward. A tokenomics model does not need to allocate all tokens to the community, but it must set a clear, transparent unlocking schedule for all stakeholders, including the team. Team transparency is always paramount. To investigate a project's tokenomics and unlocking schedule, use certain unlock-tracking tools to search for the target project, then examine the price trend after the project's last token unlock to assess the unlock's impact on price.
Token unlocks worth over $431 million will enter circulation. Projects such as ZRO, SCR, and MBG feature significant releases, while SOL, WLD, and TRUMP lead the daily unlocks. Between October 20 and October 27, several major token unlock events are scheduled, totaling over $431 million in value. Tokenomist data shows a combination of single large unlocks and continuous linear unlocks across a variety of blockchain projects. These events represent significant movements in circulating supply, potentially influencing short-term market liquidity and token distribution patterns.

Single Large Unlocks Exceeding $5 Million

Among the single large token unlocks, a summarized report by Wu Blockchain shows that ZRO leads with 25.71 million tokens valued at $44.48 million, representing 7.86% of its total supply. XPL follows with 88.89 million tokens worth $37.41 million, unlocking 4.97% of its supply. MBG records 15.84 million tokens valued at $17.09 million, amounting to 11.97% of supply. SCR registers a substantial unlock of 82.50 million tokens, valued at $14.79 million, equal to 43.42% of its available supply, one of the highest proportional releases of the period. Additional single unlocks include SOON with 15.21 million tokens valued at $14.47 million, or 4.52% of its supply. UDS releases 3.97 million tokens worth $9.89 million, accounting for 2.85%. KAITO's 8.35 million tokens equal $9.15 million, unlocking 3.06% of its supply. Project H will release 62.50 million tokens worth $8.95 million. SAHARA adds 84.27 million tokens worth $6.56 million, while VENOM unlocks 59.26 million tokens valued at $6.18 million, representing 3.60% and 2.23% of their respective supplies. These single-event releases reflect concentrated liquidity injections within a short timeframe.

Continuous Linear Token Unlocks Across Major Tokens

The same week includes continuous daily unlocks exceeding $1 million in value.
SOL leads this group, releasing 496.02 thousand tokens valued at $95.78 million, representing 0.09% of its circulating supply. WLD follows with 37.23 million tokens worth $34.50 million, equal to 1.68%. TRUMP unlocks 4.89 million tokens valued at $29.44 million, covering 2.45% of supply. DOGE records 96.78 million tokens worth $19.48 million, adding 0.06% to circulation. AVAX contributes 700 thousand tokens valued at $14.62 million, marking 0.16% of its supply. IP unlocks 2.32 million tokens worth $13.18 million, equivalent to 0.72%. Further linear releases include ASTER with 10.28 million tokens valued at $12.55 million, TAO with 25.20 thousand tokens worth $11.10 million, and ETHFI with 8.53 million tokens valued at $9.47 million. TIA, SUI, and DOT release 8.07 million, 2.71 million, and 2.30 million tokens respectively, each valued between $7 million and $8.5 million and representing between 0.70% and 0.98% of supply.
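Figures like these reduce to two pieces of arithmetic: unlock value = tokens × price, and supply share = tokens ÷ supply. A quick sketch with hypothetical price and supply inputs (the real parameters of any given ticker differ):

```python
# Cross-checking unlock figures: value = tokens * price, share = tokens / supply.
# The price and supply below are hypothetical, chosen only to mirror the arithmetic.

def unlock_summary(tokens_m, price_usd, circulating_m):
    """tokens_m and circulating_m in millions of tokens.
    Returns (unlock value in $M, share of circulating supply in %)."""
    value_musd = tokens_m * price_usd
    share_pct = 100.0 * tokens_m / circulating_m
    return value_musd, share_pct

# Hypothetical: an 82.5M-token unlock at $0.18 against 190M circulating tokens.
value, share = unlock_summary(82.5, 0.18, 190.0)
print(f"unlock value: ${value:.2f}M, share of circulating supply: {share:.2f}%")
```

Note that outlets mix the denominators: some report against total supply, others against circulating supply, which is why the same unlock can appear with very different percentages.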
The market last week was far from calm. After the epic leverage liquidation triggered by the macro tariff "black swan" event the previous weekend (October 10), the entire crypto industry spent last week (October 13-17) struggling to recover from the shock. Bitcoin fell from a high of $126,000 to below $107,000 at one point, wiping out billions in capital, and the panic in the market has not yet fully dissipated. This week, just as the market has stepped out of the "intensive care unit" (ICU), it will immediately face two opposing but equally powerful forces: one is the "internal game" from Washington, which concerns the long-term future of the industry; the other is the "external shock" from the macro economy, which determines the short-term volatility at hand. This is a week where "long-term regulatory narratives" and "short-term macro data" collide fiercely, as the market tries to find a new balance amidst the ruins. Focus 1: Washington's Banquet? Crypto Giants Gather in the Senate This Wednesday, Washington will host the highest-level "closed-door roundtable" in the crypto industry in recent years. According to crypto journalist Eleanor Terrett, CEOs or chief legal officers from almost all leading U.S. crypto companies—including Coinbase, Chainlink, Galaxy, Kraken, Uniswap, Circle, Ripple, and a16z crypto—will meet with pro-crypto Democratic senators. The topic of this meeting goes straight to the core—"market structure legislation and future development direction." This is by no means an ordinary PR meeting. After a long regulatory tug-of-war, this is more like a "showdown." Industry giants are trying to present a unified and strongest voice before the regulatory framework is finalized. The outcome of this meeting may directly influence the legislative tone of the United States toward crypto assets (especially DeFi and stablecoins) in the coming years. Long-term investors in the market are holding their breath in anticipation. 
Focus 2: Macro Super Friday and the Federal Reserve's "Crypto Debut" If Washington decides the "long term," then this week's macro data determines the "here and now." First, due to the government shutdown delay, the U.S. September CPI data originally scheduled for release last week will be announced on the same day as the October Markit Manufacturing PMI data (this Friday, October 24, UTC+8). This creates a rare "macro super Friday." The market generally expects CPI to remain high, with core inflation still stubbornly sticky. These two data points are the most crucial pieces of the puzzle for the Federal Reserve's next rate-setting meeting, and any numbers that exceed expectations could trigger short-term market panic or euphoria on Friday. What the crypto industry should be even more wary of is that the Federal Reserve itself is also "entering the game." This Tuesday (October 21, UTC+8), the Federal Reserve will hold a meeting on "payment innovation." The topics are strikingly close to the core of crypto: stablecoins, artificial intelligence, and tokenization. Federal Reserve Governor Christopher Waller will deliver the opening speech. This is almost the first time the Federal Reserve has so intensively discussed these emerging topics at an official meeting. Are they preparing to embrace, regulate, or "incorporate" them? Waller's wording will be an important indicator for interpreting future regulatory attitudes, especially stablecoin policy. Focus 3: Earnings Season and Internal Market Selling Pressure Beyond the main themes of regulation and macroeconomics, two "noise sources" are equally noteworthy. First, the earnings season in both China and the U.S. is reaching its climax. This week, Tesla, Intel, Netflix, as well as A-share companies CATL and iFlytek, will release their results. 
In the current fragile market sentiment, the performance of these "bellwether" companies in the tech and AI sectors will directly affect the Nasdaq's trend, which in turn will transmit strongly to the crypto market, where risk appetite is highly correlated. Second is the most direct "selling pressure test" within the market. According to Token Unlocks data, this week will see large one-time token unlocks with a total value exceeding $50 million. The pressure on several major tokens is not to be underestimated:

LayerZero (ZRO): unlocking about $43.19 million (7.86% of circulating supply) on October 20 (UTC+8)

Scroll (SCR): unlocking about $14.23 million (43.42% of circulating supply) on October 22 (UTC+8)

MBG By Multibank Group (MBG): unlocking about $17.04 million (11.97% of circulating supply) on October 22 (UTC+8)

Such concentrated unlocking, especially during the sensitive window before macro data releases, will severely test the liquidity absorption capacity of tokens like ZRO and SCR.

Summary

In summary, this is by no means a calm week. On Monday (today), a series of data including China's GDP will set the "opening tone" for global risk assets; on Tuesday, the Federal Reserve's "payment innovation" meeting will test the boundaries of regulation; on Wednesday, crypto giants will make their push in Washington; finally, all emotions will be unleashed on Friday with the U.S. "CPI+PMI" data combo. Investors should fasten their seat belts: this is a week that will test resolve and is full of uncertainties.
Original Article Title: How Polymarket Insiders Can Help You Win Almost Every Time Original Article Author: The Smart Ape, LBank Partner Original Article Translation: AididiaoJP, Foresight News How to Find Polymarket Insiders Polymarket is a large and rapidly growing market, with a trading volume exceeding $15 billion since its launch. What is fascinating is that users can employ many advanced strategies to profit, such as arbitrage, providing liquidity, capturing discounts, high-frequency trading, and more. It is still an early and evolving market, now entering a regulatory phase, which means there are still plenty of opportunities. But one method is still largely untapped: insider analysis. Polymarket is an open platform, meaning anyone can create markets on anything. Some markets are entirely based on public information, such as "Who will win the next World Cup?," while others involve events where a small number of people already know the answer, like "Who will receive the next Nobel Peace Prize?" In the Nobel Prize market, the committee responsible for selecting the Nobel Prize laureates obviously knows the results earlier than anyone else, and some of them may quietly use this information to trade on Polymarket. If you can track the movements of these insiders, you can essentially bet on the correct outcome almost with certainty because insiders know exactly what will happen. Another example is "Monad Airdrop by October 31st." The project team and those closely related to the project already know if it will happen, so anyone able to track those wallets has a significant advantage. There are several ways to detect potential insider activity. The simplest method is to use Hashdive(dot)com, which is currently the best Polymarket analytics tool, providing extensive metrics and data for each market. · First, select a market where insider activity may be taking place, such as the Monad airdrop. 
· Click into that market, and you will see a detailed page including analysis and metrics. · Scroll down to the "Possible Insiders" section. Take the first trader on the list as an example: they have wagered $100,000 on "No," and it is their only trade in this market. That is highly suspicious: a brand-new wallet putting a large sum into a single market. This individual is likely a member of the Monad team or someone close to it. The goal is not to focus on individual traders but to analyze the collective activity of the group. Some may be true insiders while others are just following along; the key is the overall pattern. In this example, almost all the top traders have wagered "No." The top eight wallets are all on the same side, each a fresh wallet holding a large position in only one or two markets. This is a clear signal: insiders seem confident there will be no Monad airdrop by October 31. At the time, the "No" side was priced around $0.83, implying a near-guaranteed return of roughly 20% (each $0.83 share pays out $1.00) by that date. Some markets have no "Possible Insiders" section, which is perfectly normal. The "Bolivia Presidential Election" market, for example, is unlikely to have real insiders: in a neck-and-neck race, nobody truly knows how people will vote. The key, therefore, is to choose markets where insider information can exist and to track insider movements early. The earlier you spot them, the higher your potential profit; wait too long and more insiders join, prices shift, and the opportunity shrinks. Your edge depends entirely on how early you find them.

Nobel Prize Case Study

One of the best examples of this strategy in practice is the "2025 Nobel Peace Prize winner" market, where some traders apparently had the information 9 hours before the official announcement.
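The return arithmetic behind such trades is mechanical: a share priced at p pays $1 if the outcome resolves true, so p is the market's implied probability and a winning bet returns (1 − p)/p. A minimal sketch:

```python
# A prediction-market share that costs p dollars pays out $1 if correct, so the
# price is the implied probability and a winning bet returns (1 - p) / p.

def implied(price):
    """Return (implied probability, fractional return if the outcome resolves true)."""
    return price, (1.0 - price) / price

# The $0.83 "No" price discussed above:
prob, ret = implied(0.83)
print(f"implied probability: {prob:.0%}, return if correct: {ret:.1%}")
```

At $0.83 this gives a return of about 20.5%, which is why a confident insider cluster on one side of a market like this is attractive despite the short time to resolution.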
In a matter of seconds, Maria Machado’s odds surged from 3.6% to 70%, well before the results were made public. This was evidently an insider move, with someone leaking the decision ahead of time. Some traders saw returns 20 times their investment, either because they followed the insider’s lead or because they were the insiders themselves: · Debased turned $2.5K into $75K · CannonFodders turned $900 into $30K · Gopfan 2 turned $700 into $26K They all entered the market as soon as Maria Machado’s odds mysteriously began to soar. These individuals could be members of the Nobel Committee, closely related to the committee, or even investigative journalists who discovered the leak. One thing is certain: some had reliable information 9 hours before the official announcement. When the Polymarket market’s odds jumped from 3% to 70% within minutes, the presence of insiders is undeniable. The Norwegian authorities even launched an insider trading investigation into the matter. Reportedly, they focused on wallet ‘6741,’ which bet $50K a few hours before the results were announced. That wallet had only transacted once and only on this market, which immediately raised suspicions. Why Having Insiders Is Actually a Good Thing Initially, you might think insiders are detrimental to Polymarket, but in reality, they helped it achieve its true purpose. Polymarket's true mission is not about making money or losing money, but about revealing the collective truth about future events. The more insiders there are, the more accurate the price, and the more reliable the information the market provides. Take the Nobel Prize, for example. I don't need to wait for the official announcement; Polymarket has already told me who the winner is. In this sense, Polymarket beats all major media outlets to the punch, which is precisely why it's so powerful. Insiders with reliable information help correct pricing errors and indirectly pass on this knowledge to everyone else through price changes. 
It's an ultra-efficient mechanism for information dissemination. Without insiders, prices reflect only opinion and speculation; with them, prices reflect hidden but real facts. That's why some economists, including the originator of the "prediction markets" concept, believe insider trading is beneficial in this setting: it narrows the gap between belief and reality. It also creates a truth-incentive system: insiders who trade on true information profit; those who are wrong or lie lose. There is no motivation to spread fake news, because being wrong has a price. Most importantly, these insiders don't harm others. Unlike the token market, where insiders dump on retail traders, prediction markets are voluntary, and traders are aware of the risk of information asymmetry. It's a game of probability, not a long-term investment. As long as the rules are clear, insiders improve the accuracy of predictions without creating systemic unfairness.

Tools to Track Them

Here are some of the most useful tools for analyzing Polymarket data. This list is not exhaustive, as new tools are constantly emerging.

Dune Dashboards: dozens of Polymarket dashboards, some overarching (volume, users, trades), others specialized (insiders, airdrop trackers, whales, etc.).

PolymarketAnalytics(dot)com: one of the most comprehensive tools. It lets you track traders in real time, discover top alerts, whales, and smart money, and analyze performance.

Hashdive(dot)com: another powerful analytics platform. Each market page includes in-depth metrics, plus a new "Insiders" section to help identify potential insider traders.
Before you start trading perpetual contracts, you must understand that this is a zero-sum game. Written by: Eric, Foresight News HyperLiquid co-founder Jeff Yan shared some thoughts early yesterday morning regarding HyperLiquid's performance during the weekend market crash, mentioning, "This is the first time in over two years of HyperLiquid's operation that cross-margin auto-deleveraging (ADL) has been triggered." Auto-deleveraging, or ADL, is something that many CEXs try to avoid at all costs, and it's also a frequent topic of complaints among users on X. Of course, it's easy to understand why people complain: auto-deleveraging is when the exchange forcibly closes users' positions, causing them to "make less money." We often see posts on X criticizing exchanges for triggering ADL, which prevents investors from realizing the paper profits they see on illiquid altcoin contracts. Extreme market conditions always prompt new reflections. This time, despite the sharp drop, HyperLiquid experienced no issues with trading or withdrawals, while some perp DEXs were forced to suspend withdrawals, leading many to reconsider the real value of ADL. Insurance Fund and ADL Since GMX, protocol vaults that allow external deposits have almost become standard for perp DEXs, which are essentially the on-chain version of an "insurance fund." For example, during last week's extreme downturn, a large number of leveraged long positions were liquidated, but there was not enough buying interest in the market to absorb them (the buying from active longs and short covering was not enough to offset the market sell orders caused by liquidations). If left unchecked, this would result in some long positions' margin being unable to cover their losses. This is when the insurance fund comes into play, maintaining market balance by absorbing the market orders caused by liquidations at the liquidation price of certain positions. 
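The fund's takeover logic described above can be sketched as follows. The interface and numbers are hypothetical, purely to illustrate how a fund steps in when the order book has no bids at a long position's liquidation price:

```python
# Minimal sketch of an insurance fund absorbing a liquidated long when the
# order book has no bids at the liquidation price. All numbers and the
# interface are hypothetical illustrations, not any exchange's actual logic.

def absorb_liquidation(fund_balance, position_size, liq_price, best_bid):
    """If the best bid sits below the liquidation price, the fund takes over
    the position at liq_price. Returns (new fund balance, absorbed?)."""
    if best_bid is not None and best_bid >= liq_price:
        return fund_balance, False          # the market can absorb the sell order
    cost = position_size * liq_price        # the fund buys the position itself
    if cost > fund_balance:
        raise RuntimeError("insurance fund exhausted: ADL required")
    return fund_balance - cost, True

# Hypothetical: a 10-unit long liquidated at $2,500 while the best bid is $2,400.
balance, absorbed = absorb_liquidation(1_000_000.0, 10.0, 2_500.0, 2_400.0)
print(balance, absorbed)  # the fund pays 25,000 to take over the position
```

The `RuntimeError` branch is exactly the situation the article goes on to describe: once the fund can no longer cover ongoing liquidations, auto-deleveraging is the only remaining backstop.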
Afterwards, when prices stabilize and new investors enter, these positions can be gradually closed to release the funds locked in them. HyperLiquid's insurance fund is HLP, and Jeff stated that to optimize risk management, HLP is divided into many sub-pools, with only one sub-pool taking over during each liquidation. The triggering of the insurance fund essentially means the market is moving towards an extreme, and the lack of orders in the opposite direction also indicates that the trend is so obvious that even the most reckless gamblers hesitate to go against it. If the insurance fund is about to be exhausted and still can't absorb the ongoing liquidations, then the dreaded but necessary ADL must be used. According to my research, there are two main ADL mechanisms in the market. One is to start auto-deleveraging in advance when the insurance fund's available capital drops below a certain threshold to minimize overall system risk. The other is to forcibly close profitable positions at the liquidation price of losing positions after the insurance fund is depleted and negative equity occurs, until the system is rebalanced. According to HyperLiquid's documentation, it uses the second method, meaning that the first cross-margin ADL in two years indicates that HLP's funds were already or nearly exhausted. Some CEXs use the first mechanism. While some smaller exchanges may maliciously reduce the profits of winners, more often, the complexity of various forms of circular collateral and lending in CEXs means that, in extreme conditions, the intensity of liquidations can be even greater than what is seen in the contract market alone, so a certain margin for error is needed. When ADL occurs, there are certain rules for who gets forced out first, usually considering profit, leverage, and position size. In other words, the largest, most profitable, or highest-leverage whales are the first to be removed from the market. 
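A toy version of such an ADL priority queue might look like this; the scoring formula (profit × leverage, then size as a tiebreaker) is a generic illustration, not HyperLiquid's or any CEX's actual rule:

```python
# Sketch of an ADL priority queue: rank open positions by a score combining
# unrealized profit and leverage (size breaks ties), and deleverage the
# top-ranked first. The formula is a generic illustration only.

def adl_order(positions):
    """positions: list of dicts with 'pnl_pct', 'leverage', 'size'.
    Returns positions sorted so the most profitable, highest-leverage,
    largest positions come first (i.e., first in line for ADL)."""
    return sorted(positions,
                  key=lambda p: (p["pnl_pct"] * p["leverage"], p["size"]),
                  reverse=True)

book = [
    {"name": "A", "pnl_pct": 0.40, "leverage": 10, "size": 50},
    {"name": "B", "pnl_pct": 0.90, "leverage": 5,  "size": 200},
    {"name": "C", "pnl_pct": 0.10, "leverage": 20, "size": 500},
]
print([p["name"] for p in adl_order(book)])  # most at risk of ADL first
```

Under this toy score, trader B (high profit at moderate leverage) is deleveraged before A and C, matching the intuition that the biggest winners are removed from the market first.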
Doug Colkitt, founder of DEX Ambient Finance on Scroll, commented on X about ADL: "The beauty of contract markets is that they are all zero-sum games, so the entire system can never go bankrupt. Not even a single bitcoin is truly devalued; it's just a bunch of boring cash. Like thermodynamics, value is never created or destroyed in the system." Zero-sum is the fundamental premise of this game. Once you truly understand this, you may gain a deeper insight into the financial games you are participating in. How Should You Accept "Making Less Money"? As mentioned earlier, whenever ADL is discussed, users almost always complain. In the eyes of most users, every liquidation or loss is a real hit, but profits are cut short by the system due to insufficient liquidity, which feels extremely unfair. Users feel that since the money lost is taken by other users, market makers, or even the exchange, then when they profit, the others should also pay up accordingly. So you need to understand the true meaning of "zero-sum game." In the perpetual contract market, ignoring fees, the amount of money lost always equals the amount of money won. Your opponents are other retail traders, institutions, market makers, and the exchange's trading team. When even the insurance fund, which is designed purely for user experience and not for profit, can barely cover the losses, it means that no other participant is willing to take the other side of your trade anymore. At this point, if you expect a profit-driven company to use its own uncertain losses to guarantee your certain gains, the likelihood is almost zero. In some cases, such as when unfounded FUD causes a token to drop sharply for a short time, even if ADL is triggered, the exchange may take over your profitable position out of confidence in the project's future (possibly by temporarily freezing profits through withdrawal or redemption restrictions). 
If you only know that perpetual contracts have isolated and cross-margin modes, know about funding rates, and know how to calculate leverage and liquidation prices, then you are not yet ready to participate in this game. "Zero-sum game" means that when your profits exceed the system's capacity, you cannot take a single penny from outside the system (i.e., the exchange itself). In other words, your profits always have an implicit ceiling, but if you started shorting bitcoin at $1 and bitcoin continues to rise in the long term, your losses have no upper limit. Of course, we can also interpret this optimistically: when you encounter ADL, it means there are no longer enough counterparties in the market to hedge your position, indicating that you chose the right direction before the trend started and held on until everyone agreed it was the right direction; it also means the exchange's insurance fund can no longer or is unwilling to take on more liquidated orders. If the exchange is not maliciously reducing your profits, then congratulations—you have already earned the maximum profit allowed by the rules and tolerated by a profit-driven company, making you the ultimate winner of this game.
Foresight News reported that the liquidity allocation protocol Turtle has announced its Genesis airdrop and released the TURTLE allocation details. Of the total, 11.9% will be distributed to contributing users: including limited partners and participants (9%), TAC Vault deposit bonus (1.2%), user referrals (0.7%), early users/Discord OG roles (0.3%), Turtle liquidity leaderboard (0.2%), dealer referrals (0.2%), Kaito leaderboard (0.1%), BeraChain NFT (0.1%), Scroll NFT (0.1%), and several other categories. Protocols and partners integrated into Turtle activities and infrastructure will receive a 2% allocation. Turtle stated that Sybil activities and bot accounts have been removed from the system. Airdrop allocations of 1,700 TURTLE or less will be fully unlocked at TGE with no vesting required; for allocations exceeding 1,700 TURTLE, 70% can be claimed immediately at TGE, while the remaining 30% will vest linearly over 12 weeks. After the airdrop launches, holders can stake TURTLE for sTURTLE to gain delegation and voting rights to participate in protocol governance. The airdrop query feature will be launched soon.
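The claim schedule described above is easy to express directly; this sketch assumes weekly granularity for the linear vesting portion:

```python
# Sketch of the Turtle claim schedule as described: allocations of 1,700 TURTLE
# or less unlock fully at TGE; larger allocations release 70% at TGE and the
# remaining 30% linearly over 12 weeks. Weekly granularity is an assumption.

def claimable(allocation, weeks_since_tge):
    """Total TURTLE claimable at a given number of weeks after TGE."""
    if allocation <= 1_700:
        return allocation                       # fully unlocked, no vesting
    vested_weeks = min(max(weeks_since_tge, 0), 12)
    return 0.7 * allocation + 0.3 * allocation * vested_weeks / 12

print(claimable(1_000, 0))    # small allocation: everything at TGE
print(claimable(10_000, 0))   # 70% tranche at TGE
print(claimable(10_000, 6))   # TGE tranche plus half the vesting tranche
print(claimable(10_000, 12))  # fully vested
```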
Vitalik Buterin considers the Fusaka upgrade and its PeerDAS technology a decisive turning point for Ethereum's future. By revolutionizing how the blockchain manages data, this innovation could well solve the difficult equation between scalability and decentralization. In brief: Vitalik Buterin states that PeerDAS is the central element of Ethereum's Fusaka upgrade; the technology allows nodes to verify blocks without storing all the data, thanks to erasure coding; and Ethereum has just reached six blobs per block for the first time, revealing growing demand from rollups. PeerDAS, a technical innovation at the heart of Fusaka: Buterin has just highlighted an innovation that could transform the Ethereum ecosystem. The co-founder identified PeerDAS ("Peer Data Availability Sampling") as the key to scaling the network sustainably in the face of ever-growing demand. Concretely, PeerDAS allows nodes to verify the existence of a data block without downloading it entirely. Instead of hosting the whole file, they rely on samples, from which the full data can be recomposed thanks to erasure coding. This method, long proven in storage and data-transmission systems, fragments the data, adds redundancy, and enables reconstruction even in case of partial loss. This breakthrough removes a historic constraint of Ethereum: nodes are no longer forced to store all the data to contribute to the network. The result is twofold: increased transaction capacity and preserved decentralization. Buterin also highlights the system's resilience: even if several actors act maliciously, the presence of a single honest validator is enough to guarantee the integrity of the process, an architecture that protects Ethereum from potential attacks while enhancing its processing power. This evolution couldn't come at a better time.
Since the introduction of "blobs" with the Dencun upgrade, their use has exploded. In August, Ethereum hit a record of six blobs per block. Layer 2 solutions like Base, Scroll, and Linea already occupy most of this space, generating over $200,000 in fees each week. In this context, PeerDAS appears as a strategic response: by optimizing data management, it offers the network a way to absorb growing demand without compromising its stability or decentralization. Ethereum adopts a progressive strategy for a long-term challenge. Buterin remains cautious, however. The number of blobs per block will not increase abruptly but in a phased progression. Too rapid a scale-up, he warns, could create imbalances and put pressure on certain parts of the network. The Fusaka schedule reflects this gradual approach: deployment is planned for December 3, 2025, preceded by public tests on several networks and accompanied by a security audit offering $2 million in rewards for identifying possible flaws. But the challenge goes far beyond Layer 2 alone. In the longer term, PeerDAS could also absorb part of the execution data of Layer 1, freeing nodes from a currently colossal load. This mechanism would give Ethereum the capacity to meet growing demand driven by DeFi, stablecoins, and asset tokenization, without sacrificing either the protocol's neutrality or its resilience. This evolution is part of an ambitious roadmap: coming after Pectra and before Glamsterdam, Fusaka is not just a technical upgrade but a true strategic building block for Ethereum's future. It reflects a clear desire to prepare the network for a central place in global finance, at the very moment when banks, companies, and states increasingly treat blockchain as critical infrastructure. Thus, PeerDAS is not a simple technical refinement; it is a direct response to the scalability and neutrality challenges Ethereum faces.
By betting on an innovation deployed cautiously but designed for the long term, Buterin seeks to transform the network gradually. If Fusaka delivers as promised, Ethereum could reach a decisive milestone and confirm its ambition: to become the essential infrastructure for global digital finance.
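The erasure-coding principle this article describes (fragment the data, add redundancy, reconstruct despite partial loss) can be shown with the simplest possible code: one XOR parity chunk that lets you rebuild any single missing chunk. Production systems, including PeerDAS, use far stronger Reed-Solomon-style codes; this toy sketch only demonstrates the idea.

```python
# Toy erasure code: k equal-length data chunks plus one XOR parity chunk.
# Any single lost chunk can be rebuilt by XOR-ing the surviving ones.
# Illustrative only; PeerDAS uses much stronger Reed-Solomon-style codes.
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encode(chunks: list[bytes]) -> list[bytes]:
    """Append a parity chunk equal to the XOR of all data chunks."""
    return chunks + [reduce(xor_bytes, chunks)]

def recover(coded: list) -> list[bytes]:
    """Rebuild at most one missing chunk (marked None), return the data."""
    missing = [i for i, c in enumerate(coded) if c is None]
    assert len(missing) <= 1, "XOR parity tolerates only one missing chunk"
    if missing:
        present = [c for c in coded if c is not None]
        coded[missing[0]] = reduce(xor_bytes, present)
    return coded[:-1]  # drop the parity chunk

data = [b"abcd", b"efgh", b"ijkl"]
coded = encode(data)
coded[1] = None                # simulate a lost chunk
print(recover(coded))          # [b'abcd', b'efgh', b'ijkl']
```

The XOR of the survivors equals the missing chunk because every other chunk appears twice in that XOR (once directly, once inside the parity) and cancels out; real codes generalize this so that many chunks can be lost at once.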
Ethereum co-founder Vitalik Buterin has identified Peer Data Availability Sampling (PeerDAS) as a crucial tool for addressing the network's growing blob storage demands. PeerDAS is a feature of the upcoming Fusaka upgrade. His remarks arrive as Ethereum records six blobs per block, a milestone that has intensified concerns about data bloat across the ecosystem. Blobs were introduced through EIP-4844 as temporary on-chain data containers, designed to lower costs for Layer-2 rollups while avoiding permanent storage pressure. Unlike call data, blobs expire after about two weeks, reducing long-term storage needs while preserving integrity for transaction verification. This structure makes rollups cheaper to operate and enhances Ethereum's scalability. However, that design has spurred the rapid adoption of blobs across the blockchain network. On Sept. 24, on-chain analyst Hildobby reported that several Ethereum layer-2 solutions, including Base, Worldcoin, Soneium, and Scroll, now rely heavily on blobs. Considering this, the analyst pointed out that validators now require more than 70 gigabytes of space to manage blobs, warning that this figure could balloon to over 1.2 terabytes if left unpruned. This sharp increase has forced developers to look for solutions that balance scalability with storage efficiency.

How PeerDAS works

Buterin explained that PeerDAS will solve this challenge by preventing any single node from storing the entire dataset and distributing responsibility across the network. According to him: “The way PeerDAS works is that each node only asks for a small number of “chunks”, as a way of probabilistically verifying that more than 50% of chunks are available.
If more than 50% of chunks are available, then the node theoretically can download those chunks, and use erasure coding to recover the rest.” However, he noted that the system still requires complete block data at certain stages, such as during the initial broadcast or if a block must be rebuilt from partial data. To guard against manipulation, Buterin stressed the importance of “honest actors” who fulfill these roles. He emphasized, however, that PeerDAS is resilient even against large groups of dishonest participants, as other nodes can assume responsibilities when needed.

Increasing blobs

Buterin pointed out that Ethereum’s core developers remain cautious about deploying PeerDAS despite their years of research on the project. To minimize risks, they agreed to stage the rollout through Blob Parameter Only (BPO) forks rather than a single leap in capacity. The first fork, scheduled for Dec. 17, will raise blob targets from 6/9 to 10/15. A second fork, planned for Jan. 7, 2026, will increase limits again to 14/21. This phased approach allows developers to monitor network performance and adjust gradually. Buterin expects blob counts to rise with these changes, laying the groundwork for more aggressive increases later. In his view, PeerDAS will be vital for sustaining layer-2 growth and preparing Ethereum’s base layer to handle higher gas limits and eventually migrate execution data entirely into blobs. The post Home staking at risk as Ethereum data loads climb from 70GB toward 1.2TB appeared first on CryptoSlate.
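The sampling logic Buterin describes can be illustrated with a toy probability model: if an adversary withholds data so that only a fraction of chunks is retrievable, the chance that every one of a node's random samples lands on an available chunk shrinks exponentially with the number of samples. This is an idealized model (independent sampling with replacement), not the actual PeerDAS parameterization.

```python
# Idealized data-availability sampling. If only a fraction `p_available`
# of chunks is actually published, each independent random sample finds
# a chunk with probability p_available, so a withholding attack survives
# k samples with probability p_available ** k. Simplified model: real
# PeerDAS samples distinct chunks and uses different parameters.

def false_accept_prob(p_available: float, samples: int) -> float:
    """Probability that a node sampling `samples` random chunks is
    fooled into accepting data of which only `p_available` exists."""
    return p_available ** samples

# An attacker publishing just under the 50% reconstruction threshold:
for k in (1, 4, 8, 16):
    print(k, false_accept_prob(0.5, k))
```

This is why each node only needs "a small number of chunks": at 0.5 availability, sixteen samples already push the false-accept probability below two in a hundred thousand, while many independent nodes sampling together make a successful withholding attack against the whole network astronomically unlikely.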
Ethereum co-founder Vitalik Buterin said the core feature of the blockchain's Fusaka upgrade, PeerDAS, is the key to scaling the network. PeerDAS, short for Peer Data Availability Sampling, enables nodes to verify that block data exists without downloading or storing it all. Instead, nodes fetch smaller "chunks" of data, then use erasure coding to reconstruct the rest, Buterin explained in an X post. Erasure coding is a data protection technique that breaks data into pieces, adds redundant information, and distributes those pieces so the original data can be reconstructed even if some parts are missing. Buterin described the approach as "pretty unprecedented" because it removes the need for any single node to hold the entire dataset. In the first version of PeerDAS, full data of a block is still needed in limited cases — when blocks are first broadcast and when partial blocks need reconstruction. Even then, he emphasized that only one honest actor is needed for that "untrusted" role to function, making the process resistant to large numbers of dishonest participants, with future improvements also allowing these two functions to be distributed.

Ethereum hits six blobs per block for first time

Buterin's comments came in response to a thread by Dragonfly Head of Data "hildobby," who noted Ethereum had just hit six blobs per block for the first time. Blobs are fixed-size packets of transaction data introduced in Ethereum's Dencun upgrade, designed to give rollups cheaper temporary storage than regular calldata. Each block has a limited "blobspace," and the number of blobs per block — the blob count — directly affects how much transaction data can be posted to Ethereum by scaling solutions. According to hildobby, increased blob usage is being driven by activity from rollups like Base, World, Scroll, Soneium, and Linea, among others.
Base and World alone are consuming most of the available blob space, with Layer 2s collectively paying around $200,000 per week in mainnet fees. Nevertheless, many blobs remain partially empty, and posting patterns are inconsistent, making blobspace harder to forecast, the analyst said. (Chart: average blob count per block. Image: hildobby.) Buterin acknowledged these pressures and said blob counts will scale conservatively at first before ramping up more aggressively over time. This cautious rollout, he stressed, is deliberate: core developers want to thoroughly test the system before expanding capacity, despite having worked on it for years already. While blob counts determine how much data rollups can post per block, scaling them too quickly could put stress on the network. PeerDAS addresses this by letting nodes verify data availability through sampling rather than storing full blobs, which underpins the cautious approach to increasing blob counts over time. Longer term, Buterin sees PeerDAS as key not just for Layer 2 scaling but for the Ethereum base layer as well. Once the gas limit rises high enough, he argued, even Layer 1 execution data could be moved into blobs. That would further reduce strain on nodes and unlock scaling headroom, allowing Ethereum to handle greater demand without sacrificing decentralization. Last week, Ethereum developers tentatively set a date of Dec. 3 for Fusaka's launch on mainnet, pending successful testnet rollouts next month. The Ethereum Foundation also launched a four-week audit contest for Fusaka, offering up to $2 million in rewards for security researchers who uncover bugs before the hard fork reaches mainnet.
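To put the blob counts and the 70 GB / 1.2 TB figures in perspective, here is a back-of-envelope estimate of the blob data a node retains at various blob counts. It assumes the commonly cited mainnet parameters of 128 KiB per blob and roughly 18 days of retention (4096 epochs of 32 twelve-second slots); these are worst-case full-blob numbers, and actual storage is lower because many blobs are partially empty and blocks rarely hit the limit.

```python
# Back-of-envelope blob storage estimate. Assumptions (standard mainnet
# parameters, but treat as illustrative): 128 KiB per blob, blobs retained
# for 4096 epochs of 32 slots (~18 days). Worst case: every block carries
# the stated blob count, fully packed.

BLOB_SIZE = 128 * 1024        # bytes per blob
RETAINED_SLOTS = 4096 * 32    # slots within the retention window

def retained_gb(blobs_per_block: float) -> float:
    """Worst-case retained blob data, in gigabytes (1 GB = 1e9 bytes)."""
    return blobs_per_block * BLOB_SIZE * RETAINED_SLOTS / 1e9

# Current target/limit (6/9) and the planned BPO steps (10/15, 14/21):
for count in (6, 9, 15, 21):
    print(count, "blobs/block ->", round(retained_gb(count), 1), "GB")
```

Even this crude model shows why developers stage the increases: going from today's six blobs per block to the 21-blob limit of the second BPO fork more than triples the worst-case retained data a home staker must hold.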