Google Traffic Going Down After Migration

Here's an update:

We migrated from vBulletin 4.x to XenForo on December 01, 2011.

The Good: Google has indexed a *LOT* more links on its own, without a sitemap! We currently have about 305k links indexed, way up from around 150k with vB 4. I'm aware that the number of links indexed is not an indicator of traffic; the point is that Google is finding the XenForo pages and indexing them.

We've just added a sitemap as an experiment and hope to have some data to post here.
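
For the curious, there's nothing fancy about the sitemap itself - it's roughly this kind of XML. (A minimal Python sketch; the domain and thread IDs are made up, not our real ones.)

```python
# Minimal sitemap sketch - hypothetical URLs, not our production code.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

for thread_id in (101, 102, 103):  # hypothetical thread IDs
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = f"https://example.com/threads/{thread_id}/"
    ET.SubElement(url, "changefreq").text = "weekly"

# Writes a sitemap.xml that search engines can fetch.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```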

The Bad (?): Traffic is still recovering, but it's on a constant growth path. I'm now convinced it'll take ~6 months for the traffic to recover completely. Being an engineering website, our traffic peaks between January - March and then July - October. I'm sure our July 2012 results are going to be a hell of a lot more interesting than ever before.

The registration rate is down. Maybe the spammers don't register on our site anymore. I'm not sure though.

The Best: The members are absolutely in love with the site. They now stay longer and contribute more, our returning-visitor rate has gone up, and a lot of our oldies are back. I must say the 'likes' system is a killer feature.

Future: We're embarking on new projects with the community and hope they'll drive it to new levels.
 
Likewise, thanks for the update. I agree with you about allowing ~6 months to recover, which is why it's always sensible to plan the conversion for your quiet period (if you have one). Other people also seem to say that more XenForo pages get indexed. I remember my vB site had 200,000 entries in the sitemap, and even years later only about 40,000 were indexed. I'm looking forward to seeing how my conversion gets on: 200 pages so far in the first 24 hours.
 
Thanks for sharing the experience - I'd also assume it takes Google a few months to catch up with thousands of threads posted in a new place.
I have to wonder whether there's a time frame for actually removing the old redirects - that is, whether having lots of redirects on an ongoing basis affects SEO.

From my limited experience, it would seem that once Google finds and indexes all the new stuff, the redirects would be of little use - and possibly even harmful...

Any opinions on that one?
 
I can't see how redirects would do any harm - they're just signposts to where the content is now, and since "hard" links to your content can hang around on other people's sites for years and years, I'd think it was a definite plus to leave the redirects in place.

All that's going to happen if you remove them is a bunch of 404s - where's the fun in that for people wanting to visit your site ... (y)
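
To put the "signpost" idea in concrete terms, a redirect layer is roughly this (a toy Python sketch, not the actual XenForo redirect scripts - the vB-style and XF-style URL patterns are just assumptions):

```python
# Toy 301 "signpost": maps an old vBulletin-style URL onto a new
# XenForo-style thread URL. Patterns are assumptions for illustration.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlsplit, parse_qs

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        parts = urlsplit(self.path)
        qs = parse_qs(parts.query)
        if parts.path.endswith("showthread.php") and "t" in qs:
            self.send_response(301)  # permanent: tells spiders where content lives now
            self.send_header("Location", f"/threads/{qs['t'][0]}/")
            self.end_headers()
        else:
            self.send_error(404)     # ...the "fun" alternative

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectHandler).serve_forever()
```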
 
Once Google has processed all the redirects, you may still get traffic coming in from old links such as non-updated directories, blogs, people's old bookmarks and other sources (including links within your own old forum posts, where members link to another post). For that reason IMO I'd just leave them in place, unless anyone has concrete evidence it could affect SEO.
 
I think it might be something to study... it seems possible to me.
Consider that even after that time, redirects would usually be set (for non-existent pages) to go to the forum home or another sitemap or directory page.
This is purely speculation, but I think there is a possibility that Google approves more of finding everything properly on its own - eventually, that is - as opposed to following redirects. But even if so, I would wait a year or so before removing them...

Of course, we have a lot of links to the old forum URLs inside the forum itself - that would be an advantage of leaving the redirects in place.

Well, I'll get to report on this soon enough - I have 15+ years of records and history in our stats...
 
I think the number of URLs listed and the traffic to the site are not related. After migrating to XF, Google is listing 2x more URLs, but the average position of our links in the search results is a bit down. I found that we're listed #1 for a few search terms where the content is *new*, but the links that used to be #1 when we were on vB are now down to #3 - #8. That might be what caused the traffic to go down.

But as I said, people are *loving it*, and when people love your site, traffic is bound to grow in the coming days. I expect our traffic to hit new records starting June/July. Until then, I'm focusing on building new stuff, creating amazing content and bringing back old members :)
 
Redirects do not affect SEO - they never have, and I can't see them ever negatively affecting anything in the future. It's the search engines' own policy to recommend that you use redirects so spiders find the content rather than dead links.

As mentioned already, if you move your site to a new folder or a new domain, you have a site full of existing internal links posted in content. This alone is enough to damage your site if you remove the redirects, because suddenly a spider finds all these dead links in your site's pages, and your credence with search engines just dropped a notch.
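
If anyone wants to audit their own site, a rough sketch like this would do it - feed it a list of old-style URLs and it reports whether each one 301s to a new location or dead-ends (stdlib-only Python; the domain and URLs here are made up):

```python
# Hypothetical checker: do old vBulletin-style URLs still 301 to the new
# locations, or do they 404? Domain and URL patterns are made up.
import http.client
from urllib.parse import urlsplit

OLD_URLS = [
    "https://example.com/forum/showthread.php?t=12345",
    "https://example.com/forum/forumdisplay.php?f=7",
]

def status_of(url):
    parts = urlsplit(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    path = parts.path + ("?" + parts.query if parts.query else "")
    conn.request("GET", path)      # no redirect-following at this level,
    resp = conn.getresponse()      # so we see the raw 301/404 status
    status, location = resp.status, resp.getheader("Location", "")
    conn.close()
    return status, location

for url in OLD_URLS:
    status, location = status_of(url)
    flag = "OK " if status in (301, 308) else "BAD"
    print(flag, status, url, "->", location)
```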

You also can no longer cite #1 positions and the like in Google, because there is no such thing as a global Google result anymore. It doesn't exist. Based on your physical location, you will get different results, regardless of how good the site is. The only result you can be close to assured of at #1 is a direct site-name / domain-name search. It ends there nowadays, because local search algorithms use your physical location as the basis for your results.

Shifting from any alternative forum software to XF WILL have consequences, no different from switching to any other new software. It has nothing to do with anything other than the fact that the site's pages are now coded differently, so they present completely differently to the algorithm.

Take an existing static page, let it rank, then change the code to switch a column, change how many posts are on a page, etc., as a test... you will completely change your rankings across the site. If you have ever changed your forum's posts-per-page figure and then monitored your statistics, you would understand this principle better. By doing so, you have literally changed the entire page's interpretation, because you've introduced enough of a text change to make a difference to an algorithm.
 
That means our upcoming WordPress redesign is going to hurt us a bit. I thought the 'code' didn't matter, as long as the text, URLs etc. are retained.
 
The code determines how the text is delivered. If the text is absolutely identical and the code delivers the page in raw format in exactly the same way, then it would not be affected.

If you changed the order of text delivery in the raw format - i.e. right column to left column, where the code literally shifted it from before the main body to after it - then you have now changed the meaning of that page according to Google's mathematical algorithm.

If you add new words, etc., then a few words missing or a few additional words may be all that is required to change the meaning of the page to a mathematical algorithm.
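
To make that concrete, here is a rough stdlib-only Python sketch of the idea: extract the text of a page in raw document order and diff two layouts of the same content. Same words, different order - and it's the order an algorithm sees. (The HTML snippets are invented; this illustrates the principle, not anything Google actually runs.)

```python
# Illustration: the *order* text appears in the raw HTML is part of what
# an algorithm "sees". Extract text in document order and diff two layouts.
from html.parser import HTMLParser
import difflib

class TextGrabber(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

def text_order(html):
    grabber = TextGrabber()
    grabber.feed(html)
    return grabber.chunks

# Same content; the sidebar moved from after the article to before it.
layout_a = "<body><article>Main post text</article><aside>Sidebar links</aside></body>"
layout_b = "<body><aside>Sidebar links</aside><article>Main post text</article></body>"

for line in difflib.unified_diff(text_order(layout_a), text_order(layout_b),
                                 "old layout", "new layout", lineterm=""):
    print(line)
```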

What is the outcome of such changes? Well, nobody has the answer until you actually make the changes and observe the results. This is why, back when I used to do this on client sites, I would warn them about initial fluctuations. After that, very minor tweaks were made to test, on a page-by-page basis, whether a page went up or down, or whether to revert it back again, etc.

This was all lovely when one could just make a landing page that could fool Google, but it's not that simple now.

If nothing other than a theme change was done, yet the theme reorganised the complete layout of the page, that alone could produce fluctuations.

The beauty with XF is that it uses HTML5, which lets Google completely drop specific elements out of their equation and use only body content - i.e. dismiss the footer, nav, sidebar, etc. HTML5 allows the clear and definitive isolation of such areas for Google to dismiss, and lets browsers identify content for built-in readers, e.g. Safari's.

If your existing WP theme is HTML5 and your new one is HTML5, then it may not make a difference. But if you are upgrading to HTML5, you may notice slight fluctuations, depending on how much content exists.

Google is pretty smart, and even without HTML5 it can identify certain areas based on how the information is presented to it - e.g. a massively long bullet list of one- or few-word links is likely navigation, so it gets dismissed; content in the footer location, dismissed; and so on. But that isn't as good as how they handle HTML5, because specific tags exist to denote such areas regardless of where they sit in the raw page layout.
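
As a rough illustration of why those tags matter: with semantic HTML5, a consumer can throw away nav / footer / aside wholesale and keep only the body content, wherever those blocks sit in the source. (A stdlib-only Python sketch of the idea - not how Google actually does it, and the example page is invented.)

```python
# Sketch: drop HTML5 "chrome" elements (nav/footer/aside/header) and keep
# only body content, regardless of where those blocks sit in the raw source.
from html.parser import HTMLParser

SKIP_TAGS = {"nav", "footer", "aside", "header"}

class MainContent(HTMLParser):
    def __init__(self):
        super().__init__()
        self.depth = 0   # nesting depth inside skipped elements
        self.kept = []
    def handle_starttag(self, tag, attrs):
        if tag in SKIP_TAGS:
            self.depth += 1
    def handle_endtag(self, tag):
        if tag in SKIP_TAGS and self.depth:
            self.depth -= 1
    def handle_data(self, data):
        if self.depth == 0 and data.strip():
            self.kept.append(data.strip())

page = ("<body><nav>Home Forums Members</nav>"
        "<article>The actual thread content</article>"
        "<footer>Copyright notice</footer></body>")

parser = MainContent()
parser.feed(page)
print(" ".join(parser.kept))   # -> "The actual thread content"
```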

A few pages doesn't mean a thing... and by "a few" I mean even 100 is a few when it comes to radically affecting traffic - we're talking blogs / forums here, versus static websites, where changes are easier to manipulate and control.
 