Affected version: 2.3.10
Code:
ErrorException: IndexNow error: [429] We're sorry, but you have sent too many requests to us recently. src/XF/Error.php:81
Generated by: Unknown account Mar 10, 2026 at 9:50 PM
Stack trace
#0 src/XF.php(270): XF\Error->logError('IndexNow error:...', false)
#1 src/XF/IndexNow/Api.php(88): XF::logError('IndexNow error:...')
#2 src/XF/IndexNow/Api.php(34): XF\IndexNow\Api->request('/?key=wPqMCWRk8...', Array, 'IndexNow error:...')
#3 src/XF/Job/ContentIndexNow.php(85): XF\IndexNow\Api->index('https://raidgam...')
#4 src/XF/Job/Manager.php(275): XF\Job\ContentIndexNow->run(7.99195)
#5 src/XF/Job/Manager.php(205): XF\Job\Manager->runJobInternal(Array, 7.99195)
#6 src/XF/Job/Manager.php(89): XF\Job\Manager->runJobEntry(Array, 7.99195)
#7 job.php(46): XF\Job\Manager->runQueue(false, 8)
#8 {main}
Request state
array(4) {
["url"] => string(8) "/job.php"
["referrer"] => string(40) "https://******/threads/820/page-202"
["_GET"] => array(0) {
}
["_POST"] => array(0) {
}
}
IndexNow is indicating that the allowed request limit has been exceeded. My question is whether this is actually the case, because their documentation states a daily limit of up to 10,000 URLs, and my forum does not generate that much activity per day. My only suspicion is my robots.txt configuration, where I blocked pagination for posts, although /posts/ itself was not blocked. At most, I specified the following:
Code:
Disallow: /#post-
Disallow: /?page=
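For context on the error itself: HTTP 429 ("Too Many Requests") is the standard rate-limiting response, and the usual client-side mitigation is to retry with an increasing delay rather than re-sending immediately. Below is a minimal, hypothetical sketch of that pattern in PHP; it is not XenForo's actual IndexNow client (XF\IndexNow\Api simply logs the error, per the trace above), and the function name and parameters are my own invention for illustration.

```php
<?php
// Hypothetical sketch: retry an HTTP request with exponential backoff
// when the server answers 429 (Too Many Requests).
// $request is any callable that performs the request and returns the HTTP status code.
function submitWithBackoff(callable $request, int $maxAttempts = 4, int $baseDelaySeconds = 1): int
{
    $delay = $baseDelaySeconds;

    for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
        $status = $request();

        if ($status !== 429) {
            // Success, or a non-rate-limit error we should not retry here.
            return $status;
        }

        if ($attempt < $maxAttempts) {
            sleep($delay);   // wait before retrying
            $delay *= 2;     // 1s, 2s, 4s, ...
        }
    }

    return 429; // still rate limited after all attempts
}
```

In a real deployment the server's `Retry-After` header, when present, should take precedence over a fixed backoff schedule.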