Hard to say without seeing the numbers, but most of the time there is nothing to worry about. 403 errors only mean that the page is blocked for guests (and Googlebot, which counts as a guest), as it should be.
I have 1.2 million pages excluded, many/most of them because they are blocked by robots.txt. I want those excluded/not crawled, so that is fine.
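Just as a hypothetical example (your actual rules will depend on your setup), a robots.txt that deliberately keeps guest-only pages out of the crawl can be as simple as:

User-agent: *
Disallow: /whats-new/
Disallow: /members/
Disallow: /login/

Every URL matching a Disallow line then shows up under "Blocked by robots.txt" in the coverage report, which is exactly the intended result.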
"Indexed, though blocked by robots.txt" will drop after some time if those pages can't be reached from another site (URL). Pages with a redirect you don't want indexed anyway, so that is normal (like /posts). The rest are just things you can use to check whether everything is being crawled the way you want.
Soft 404, for example, can also happen with low-content pages, as Google states. The numbers are also not 100% accurate and can vary over time depending on updates Google does. My 404 count dropped over time by itself. Crawl anomalies can happen for different reasons, and that number stays about the same most of the time.
Not found (404) is for pages that have been removed. "Duplicate without user-selected canonical" happens with attachments, for example; all other pages on XF have a correct rel canonical where needed. These have also dropped over time.
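For reference, a correct canonical is just a single link tag in the page head pointing at the preferred URL, something like this (example URL, not a real one):

<link rel="canonical" href="https://example.com/threads/example-thread.12345/" />

Attachment pages don't carry one, which is why they can land under "Duplicate without user-selected canonical" and Google just picks a canonical itself.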