As I reported a few days ago, I built a sitemap manually for a long forum thread, and the results are finally in.
As expected, Google never crawled most pages of these long threads. These are not chit-chat or meme threads, or "say hello and introduce yourself" threads for newbies. We're talking about thousands of pages of extensive, unique, valuable content: people writing lengthy posts discussing science and economics. If these were WordPress or Medium pages, they would be crawled and indexed in a split second. XenForo's pagination is simply insufficient to signal the importance of these pages to Google.
As soon as I submitted every page of that thread in a separate sitemap, Google began reporting status on these URLs. They now show up as "Discovered – currently not indexed," but note that the last crawl date is "Never." That's different from "Crawled – currently not indexed," which is for pages Google has fetched but judged not worth indexing. These were excluded simply because Google's bots never got to them. This just breaks my heart.
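For anyone who wants to try the same experiment, here's a minimal sketch of how such a per-thread sitemap can be generated. It assumes XenForo's default URL pattern (`/threads/<slug>.<id>/page-<n>`, with page 1 having no suffix) and a made-up thread path and page count; adjust both for your own forum.

```python
# Sketch: build a sitemap XML file listing every page of one long thread.
# Assumptions (not from the post above): XenForo's default URL pattern
# /threads/<slug>.<id>/page-<n>, and a known total page count. Google
# accepts up to 50,000 URLs per sitemap file, far more than any single
# thread needs.
from xml.sax.saxutils import escape

def thread_sitemap(base_url: str, thread_path: str, total_pages: int) -> str:
    """Return sitemap XML covering pages 1..total_pages of one thread."""
    # Page 1 is the bare thread URL; pages 2+ get a /page-N suffix.
    urls = [f"{base_url}/{thread_path}"]
    urls += [f"{base_url}/{thread_path}/page-{n}"
             for n in range(2, total_pages + 1)]
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

# Hypothetical example: a 1,200-page thread.
xml = thread_sitemap("https://example.com",
                     "threads/economics-megathread.123", 1200)
```

Write the result to a file, upload it to the site root, and submit it in Search Console as a separate sitemap alongside XenForo's built-in one.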