HELP

Hi Base44 team,

I have a critical SEO issue with my production domain:

https://kryonaiapp.com

The robots.txt currently served is:

User-agent: *
Disallow: /

This is blocking Google from crawling my entire site.

Important context:

  • I already created public SEO pages (/como-funciona, /ejemplos, etc.)

  • I submitted my sitemap to Google Search Console

  • Some pages are already indexed, but crawling is still being blocked by robots.txt

I also created public/robots.txt with:

User-agent: *
Allow: /

Sitemap: https://kryonaiapp.com/sitemap.xml

However, it seems Base44 is overriding it at the platform level.
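To confirm the behavior difference between the two files, here is a quick local sanity check using Python's standard-library `urllib.robotparser`. The robots.txt bodies below are taken from this post; the `can_crawl` helper and the example path `/como-funciona` are just for illustration:

```python
from urllib import robotparser

def can_crawl(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    """Parse a robots.txt body and report whether `agent` may fetch `url`."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

# What the platform currently serves: blocks all crawlers from everything.
served = "User-agent: *\nDisallow: /\n"
# The intended public/robots.txt: allows crawling and points at the sitemap.
intended = "User-agent: *\nAllow: /\nSitemap: https://kryonaiapp.com/sitemap.xml\n"

print(can_crawl(served, "https://kryonaiapp.com/como-funciona"))    # False
print(can_crawl(intended, "https://kryonaiapp.com/como-funciona"))  # True
```

Running the same check against the live URL (fetch `https://kryonaiapp.com/robots.txt` and pass its body to `can_crawl`) shows which version is actually being served.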

Request: Please remove the global Disallow or allow crawling for my domain.

This is urgent because it completely blocks organic traffic.

Thanks.


Status: In Review
Board: 💡 Feature Request
Date: About 3 hours ago
Author: Paul Orozco A.
