<?php
// create curl resource
$ch = curl_init();
// set url
$url = "https://www.google.com";
echo "URL: " . $url . "<br>";
curl_setopt($ch, CURLOPT_URL, $url);
//return the transfer as a string
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// $output contains the output string
$output = curl_exec($ch);
echo "Output: <br>";
var_dump($output); // var_dump() prints directly and returns null, so don't concatenate it
echo "CURLINFO_HTTP_CODE: " . curl_getinfo($ch, CURLINFO_HTTP_CODE);
// close curl resource to free up system resources
curl_close($ch);
?>
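The snippet above only shows the transfer result; when `curl_exec()` returns `false` it says nothing about *why*. A minimal diagnostic sketch (the URL and timeout are just examples) that surfaces cURL's own error message, which usually distinguishes DNS, firewall, and SSL problems:

```php
<?php
// Diagnostic sketch: report curl's error details when a transfer fails.
$ch = curl_init('https://www.google.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // example timeout, adjust as needed

$output = curl_exec($ch);
if ($output === false) {
    // curl_errno()/curl_error() identify the failing layer (DNS, connect, TLS, ...)
    echo 'cURL error ' . curl_errno($ch) . ': ' . curl_error($ch) . "\n";
} else {
    echo 'HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
}
curl_close($ch);
```

If this prints an error like "Could not resolve host" or "Connection timed out", the problem is at the network/firewall level rather than in the forum software.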
$this->socket = @stream_socket_client($host . ':' . $port,
    $errno,
    $errstr,
    (int) $this->config['timeout'],
    $flags,
    $context);
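Note the `@` in that call suppresses warnings, which hides the reason a connection fails. A self-contained sketch (not XenForo's actual code; host and port are examples) that makes the same kind of connection and prints the error that `$errno`/`$errstr` capture:

```php
<?php
// Sketch: test a socket connection like the one above, but report failures.
$errno  = 0;
$errstr = '';
// Example target; substitute the host/port your forum is trying to reach.
$socket = stream_socket_client('tcp://www.google.com:443', $errno, $errstr, 10);

if ($socket === false) {
    // $errstr explains the failure (e.g. "Connection refused", "Connection timed out")
    echo "Connection failed ($errno): $errstr\n";
} else {
    echo "Connection succeeded\n";
    fclose($socket);
}
```

Running this directly on the web server narrows down whether the socket layer itself is blocked.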
I am sorry to have to say this, but: the service provider tried to access Google with wget from the web server hosting my forum, and it worked. Where do I go from here?
In another thread I saw that the Sitemap URL nowadays is /sitemap/sitemap.php. Should I change the robots.txt to that?

User-agent: *
Disallow: /account/
Disallow: /find-new/
Disallow: /help/
Disallow: /goto/
Disallow: /login/
Disallow: /lost-password/
Disallow: /misc/style/
Disallow: /online/
Disallow: /posts/
Disallow: /recent-activity/
Disallow: /register/
Disallow: /search/
Disallow: /admin.php
Disallow: /index.php?account/
Disallow: /index.php?find-new/
Disallow: /index.php?help/
Disallow: /index.php?goto/
Disallow: /index.php?login/
Disallow: /index.php?lost-password/
Disallow: /index.php?misc/style/
Disallow: /index.php?online/
Disallow: /index.php?posts/
Disallow: /index.php?recent-activity/
Disallow: /index.php?register/
Disallow: /index.php?search/
Allow: /
Sitemap: https://www.mydomain.com/sitemap/sitemap.xml.gz
Perfect. And that's the URL I will submit to Google Search Console as well?

If this robots.txt is for a XenForo website, the correct Sitemap entry would be https://www.mydomain.com/sitemap.php
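With that change, the tail of the robots.txt posted above would read as follows (www.mydomain.com is a placeholder for the actual forum domain):

```
Allow: /
Sitemap: https://www.mydomain.com/sitemap.php
```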