Advanced SEO Controls for Discovery Platforms

I’m currently building a discovery-based platform on Base44 with a large number of dynamic public pages (projects, creator profiles, articles, and curated listings).

Base44’s auto-generated sitemap.xml and robots.txt work well for simple apps. However, for discovery platforms at scale, SEO quality depends less on “what exists” and more on “what should be indexed.”

Some use cases where additional control would be extremely valuable:

  • Preventing thin or low-quality public pages from being indexed while keeping them accessible to users

  • Declaring canonical URLs for dynamic routes (e.g. slug-based pages)

  • Excluding filtered or utility-style pages from search results

  • Controlling which public pages appear in the sitemap based on metadata (verification status, content quality, completeness, etc.)
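For concreteness, the first two bullets boil down to emitting standard `<head>` signals per page. A minimal sketch of what a declarative control could drive — the `SeoIntent` shape and `renderSeoTags` helper are illustrative, not an existing Base44 API:

```typescript
// Hypothetical per-page indexing intent (names are illustrative).
interface SeoIntent {
  index: boolean;           // emit noindex when false, page stays publicly reachable
  canonical?: string;       // canonical URL for slug-based dynamic routes
  includeInSitemap: boolean; // whether the page is listed in sitemap.xml
}

// Render the standard head tags such a declaration would produce.
function renderSeoTags(intent: SeoIntent): string {
  const tags: string[] = [
    `<meta name="robots" content="${intent.index ? "index,follow" : "noindex,follow"}">`,
  ];
  if (intent.canonical) {
    tags.push(`<link rel="canonical" href="${intent.canonical}">`);
  }
  return tags.join("\n");
}
```

A thin project page would declare `{ index: false, includeInSitemap: false }` and remain visible to users while search engines skip it.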

This isn’t about manual editing of robots.txt or sitemap.xml, but about giving creators a structured way to express indexing intent at the page or entity level.
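To illustrate what entity-level intent could feed into, here is a sketch of metadata-driven sitemap inclusion — the `PublicPage` fields and thresholds are assumptions for the example, not real platform fields:

```typescript
// Hypothetical public-page metadata (fields and thresholds are illustrative).
interface PublicPage {
  url: string;
  verified: boolean;   // e.g. creator verification status
  wordCount: number;   // a crude proxy for content completeness
}

// Only verified, substantial pages are listed in the generated sitemap.
function sitemapUrls(pages: PublicPage[]): string[] {
  return pages
    .filter((p) => p.verified && p.wordCount >= 200)
    .map((p) => p.url);
}
```

The point is that the filter criteria would be declared by the creator, not hard-coded by the platform.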

As Base44 enables more complex, content-rich apps, having SEO signals that reflect content quality and intent would unlock significantly better long-term discoverability.

Thanks for considering — happy to provide concrete examples if helpful.

Status: In Review
Board: 💡 Feature Request
Date: 3 months ago
Author: Julie Guay
