How do I stop these attacks?

Now that I have access again, is the TPU Spam add-on my best option? Is there anything else I should look for or do in cPanel?
I have it installed on my active site (and one that is now inactive) and have never had an issue with any human spam like that... and those sites have been out on the 'net for a while so if they were going to get hit, you'd figure that they already would have.

If you were on a VPS/dedicated server, this would be much easier. I don't think cPanel allows blocking based on location, only by IP/CIDR.
 
How much transfer is included with your account? That's not a crazy amount you are pulling there by any means.

I respectfully disagree. Looking at that last chart, at one point the site was pushing 12 Megabytes a Minute.

Assuming I did my math correctly (someone please feel free to correct it)

12 Megabytes/Minute x 60 seconds x 8bits = 5760 Megabits per second.

That's roughly 5.76Gbps.
 
I respectfully disagree. Looking at that last chart, at one point the site was pushing 12 Megabytes a Minute.

Assuming I did my math correctly (someone please feel free to correct it)

12 Megabytes/Minute x 60 seconds x 8bits = 5760 Megabits per second.

That's roughly 5.76Gbps.

Your math is way off. You multiplied where you need to divide.

12 Megabytes per minute is about 96 Megabits per minute. Divide that by 60, and you're looking at about 1.6 Megabits per second. That's a tiny amount of bandwidth.

According to the chart, the average is around 3 MB/minute anyway... around 4 GB per day.
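The conversion can be sketched in Python (assuming decimal units, i.e. 1 MB = 8 Mb and 1 GB = 1000 MB; the chart's exact units aren't stated):

```python
# Sketch: convert the chart's MB/minute readings into more familiar rates.
def mb_per_min_to_mbps(mb_per_min):
    # megabytes -> megabits (x8), per minute -> per second (/60)
    return mb_per_min * 8 / 60

def mb_per_min_to_gb_per_day(mb_per_min):
    # minutes per day = 60 * 24; megabytes -> gigabytes (/1000)
    return mb_per_min * 60 * 24 / 1000

print(mb_per_min_to_mbps(12))        # peak on the chart: 1.6 Mbps
print(mb_per_min_to_gb_per_day(3))   # average: 4.32 GB/day
```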



What I'm saying is that in the grand scheme of things, it's not a whole lot of data. This isn't something that is going to bring their servers down and cause issues for other clients. However, it's a HUGE amount of data for a forum that is closed. What could spambots possibly be pulling from a closed forum that is causing that kind of data usage? I guess if you have 1000 people pulling the forum logo or such constantly, it could do it, but that's a tad ridiculous.

That's why to me, I think the OP might be approaching things incorrectly by assuming it's spambots. It kinda has small Layer 7 DDoS attack written all over it. But I don't know.
 
20 gigabytes is different for everybody.

I'm in the lull period at the moment.

It is interesting how the traffic over the days built up.
 

Attachments

  • Screenshot_20161123-074517.webp (21.9 KB)
  • Screenshot_20161123-074523.webp (33.4 KB)
Just a thought. I've been seeing a lot of Majestic bots the past few weeks. Could this have anything to do with this spam attack?
 
After installing the spam blocker, the site got shut down again.

This is the message from the host:

Looks like many bots are crawling your site and the hits are causing the bandwidth consumption. You will need to block the unwanted bots in the robots.txt file under your account. I have enabled the IP block again on the domain.

He gave me a list to block. Where can I find the robots.txt file?
 
Looks like many bots are crawling your site and the hits are causing the bandwidth consumption. You will need to block the unwanted bots in the robots.txt file under your account.
Useless.

Can I block just bad robots?
In theory yes; in practice, no. If the bad robot obeys /robots.txt, and you know the name it scans for in the User-Agent field, then you can create a section in your /robots.txt to exclude it specifically. But almost all bad robots ignore /robots.txt, making that pointless.

If the bad robot operates from a single IP address, you can block its access to your web server through server configuration or with a network firewall.

If copies of the robot operate from lots of different IP addresses, such as hijacked PCs that are part of a large botnet, then it becomes more difficult. The best option then is to use advanced firewall rules that automatically block IP addresses making many connections; but that can hit good robots as well as your bad robots.
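As a sketch of the single-IP case on a typical Apache shared host, the block can go in .htaccess (the address below is a placeholder from the documentation range, not a real offender):

```
# Deny one offending IP (Apache 2.4 syntax); 203.0.113.42 is a placeholder
<RequireAll>
    Require all granted
    Require not ip 203.0.113.42
</RequireAll>
```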


He gave me a list to block. Where can I find the robots.txt file?
http://yourdomain.com/robots.txt
 
robots.txt is not totally useless. It's only useless against rogue agents that don't honor it... and for those there are IP blocks. :p
 
After installing the spam blocker, the site got shut down again.

This is the message from the host:

Looks like many bots are crawling your site and the hits are causing the bandwidth consumption. You will need to block the unwanted bots in the robots.txt file under your account. I have enabled the IP block again on the domain.

He gave me a list to block. Where can I find the robots.txt file?
Your host sucks, and if it's shared hosting, this will happen all the time. Change hosts and the problem is solved!
 
I just realized I've had a robots.txt on my other site, and perhaps that's why I've never had a problem with spam bots.

This is what I have:

User-agent: *
Disallow: /forum/find-new/
Disallow: /forum/account/
Disallow: /forum/attachments/
Disallow: /forum/goto/
Disallow: /forum/posts/
Disallow: /forum/login/
Disallow: /forum/events/
Disallow: /forum/profile/
Disallow: /forum/calendar/
Disallow: /forum/misc/
Disallow: /forum/search/
Disallow: /forum/members/
Disallow: /forum/register/
Disallow: /forum/online/
Disallow: /forum/recent-activity/
Disallow: /forum/lost-password/
Disallow: /forum/admin.php
Disallow: /forum/help/
Allow: /

Should I add the list of bots the host gave me below this?
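For reference, blocking specific crawlers in robots.txt means giving each one its own User-agent section alongside the wildcard one. A sketch with example bot names (the host's actual list isn't shown here; MJ12bot is Majestic's crawler, mentioned earlier in the thread):

```
# Example blocks; replace these User-agent names with the ones your host listed
User-agent: MJ12bot
Disallow: /

User-agent: AhrefsBot
Disallow: /

# ...followed by the existing wildcard rules for all other crawlers
User-agent: *
Disallow: /forum/find-new/
```

Note this only works for bots that honor robots.txt; anything else needs the IP blocks discussed above.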
 
and my robots.txt looks like this:

User-agent: *
Disallow: /find-new/
Disallow: /forums/-/
Disallow: /account/
Disallow: /attachments/
Disallow: /goto/
Disallow: /posts/
Disallow: /login/
Disallow: /admin.php
Allow: /
 