Methodology

How We Select Topics and Formats

Our filter is simple:
  • Will this help readers right now?
  • Can we prove it with publicly available facts?
  • Can we add something useful?
For each pitch we check search intent, recency of facts, and whether we can add hands-on insight, not just collect links.

Rankings focus on proof of work. Candidates must show a public track record. We check named clients, case studies with outcomes, client references, third-party mentions, years active, team transparency, and specialization. If a company is new, it needs stronger evidence, such as multiple verifiable references.

Strategy guides must map to real use cases for Web3 marketing teams. The topic must solve a common task for crypto projects and lead to actions that can be measured. We favor areas where we can add steps, tools, and templates, not just opinions.

Trend reviews cover themes with traction, not noise. We pick themes with real adoption signals. We watch on-chain metrics, developer activity, funding rounds, user cohorts, and partner integrations.

Metrics and Weights

We score topics on a 100-point scale. Each article type has its own mix.

Rankings of agencies and projects:
  • Proof of results (35). Named case studies with outcomes, repeat clients, public KPIs.
  • Client proof (20). Verifiable references, testimonials with names, third-party listings.
  • Transparency (15). Team, services, pricing ranges, leadership profiles.
  • Expert signal (15). Speaking gigs, open materials, community education.
  • Reach and ER (10). Social footprint and engagement rate from public channels.
  • Freshness (5). Data or work shipped in the last 12 months.
Marketing strategy guides:
  • Practical value (30). Clear steps, tooling paths, templates, example timelines.
  • Outcome clarity (20). What success looks like and how to track it.
  • Evidence (20). Screens, datasets, public benchmarks, named sources.
  • Reader demand (20). Search interest and common client requests.
  • Maintenance cost (10). Update cadence and risk of going stale.
Web3 trend reviews:
  • Adoption data (35). Active users, DAU or MAU, on-chain volume, TVL, retention.
  • Developer signal (20). Commits, repos, releases, contributor count.
  • Capital flows (15). Rounds, grants, ecosystem funds, token unlock impact.
  • Integrations (15). Wallets, protocols, exchanges, partner SDKs.
  • Reach and ER (10). Social and search interest with sustained traction.
  • Freshness (5). Signals stable across a 90-day window.
Weights can shift by ±5 points for niche topics.
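The weighted scoring above can be sketched in a few lines. This is an illustrative model, not our internal tooling: the criterion keys and the 0.0–1.0 sub-scores for the example candidate are hypothetical; only the weights come from this page.

```python
# Weights for agency/project rankings, taken from the list above.
# Keys and the sub-scoring convention (0.0-1.0 per criterion) are
# illustrative assumptions, not a real internal schema.
RANKING_WEIGHTS = {
    "proof_of_results": 35,
    "client_proof": 20,
    "transparency": 15,
    "expert_signal": 15,
    "reach_and_er": 10,
    "freshness": 5,
}

def total_score(weights: dict, sub_scores: dict) -> float:
    """Rate each criterion 0.0-1.0, scale by its weight, and sum.

    Weights sum to 100, so a perfect candidate scores 100 points.
    A missing criterion counts as 0.
    """
    return sum(w * sub_scores.get(name, 0.0) for name, w in weights.items())

# Hypothetical candidate: strong case studies, thin social footprint.
candidate = {
    "proof_of_results": 0.9,
    "client_proof": 0.8,
    "transparency": 1.0,
    "expert_signal": 0.6,
    "reach_and_er": 0.3,
    "freshness": 1.0,
}
print(round(total_score(RANKING_WEIGHTS, candidate), 1))
```

The same function works for the guide and trend-review mixes; only the weight table changes.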

Data Sources We Trust

Our work starts with public, verifiable data. We pull on-chain stats from block explorers and open dashboards, then cross-check with independent analytics. We review company sites, pricing pages, case studies with named clients, audit reports, and official docs.

We track community health through X, Telegram, Discord, and long-form channels. Developer activity comes from repos and release notes. Funding and traction are checked against market databases and public announcements. Press, research, and ecosystem hubs help confirm context.

What we do not accept: anonymous tips without proof, unverifiable screenshots, scraped paywalled content, or private data shared without permission.

Content Quality and UX

Every article must read fast, look clean on mobile, and back claims with clear proof. We run the checks below before publishing.

  • Clarity. Direct voice, clear headers. Every section answers one question and ends with a takeaway a reader can act on. We avoid long intros and slow ramps.
  • Readability. Median sentence length of 18 words or fewer. Plain words over jargon.
  • Structure. H2 every 300 to 400 words. Logical H3s. A short summary up top.
  • Speed. Pages load fast on shaky mobile connections. Images are compressed without losing legibility. No heavy embeds without a reason and a fallback.
If any line fails, we fix it before the piece goes live.
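The median-sentence-length rule from the checklist is easy to automate. This is a rough sketch: the sentence splitter is deliberately naive (it splits on `.`, `!`, `?`), so real copy would need smarter handling of abbreviations, URLs, and decimals.

```python
import re
import statistics

MAX_MEDIAN_WORDS = 18  # the readability budget from the checklist above

def median_sentence_length(text: str) -> float:
    """Median word count per sentence, with naive punctuation splitting."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return statistics.median(len(s.split()) for s in sentences)

def passes_readability(text: str) -> bool:
    return median_sentence_length(text) <= MAX_MEDIAN_WORDS

draft = "Short sentences read fast. They also scan well on mobile."
print(passes_readability(draft))
```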

How We Edit and Recheck

Every article goes through three stages: writing, review, and upkeep. Review includes a peer check for structure and a final check for data and context. After publishing, we tag the date and revisit the piece on a set timer: high-traffic articles and rankings are reviewed every quarter, slower ones twice a year. If someone flags an issue or submits a correction request, we move it to the front of the queue and act within a few workdays.
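The review timer works out to a simple date calculation. A toy sketch, with tier names and the day counts as illustrative assumptions (roughly quarterly and roughly twice a year):

```python
from datetime import date, timedelta

# Hypothetical tiers mapping to the cadence described above.
REVIEW_INTERVAL_DAYS = {
    "high_traffic": 91,   # roughly every quarter
    "standard": 182,      # roughly twice a year
}

def next_review(published: date, tier: str) -> date:
    """Date when an article in the given tier is due for recheck."""
    return published + timedelta(days=REVIEW_INTERVAL_DAYS[tier])

print(next_review(date(2024, 1, 15), "high_traffic"))
```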

Disclaimer

Our articles are research and opinion for education. They are not investment, legal, or tax advice. Make your own decisions and accept the risks. Check local rules in your country before acting.

Each page is a snapshot in time. Markets move fast. Metrics can change after we publish. We rely on public and third-party sources. Errors can exist. We save links and screenshots for key claims.

Rankings and picks reflect our method and editor judgment. They are not endorsements or guarantees of performance. The lists are not exhaustive.