How to Optimize Your WordPress Robots.txt for SEO

Recently, one of our readers asked us for tips on how to optimize the robots.txt file to improve SEO. A robots.txt file tells search engines how to crawl your website, which makes it an incredibly powerful SEO tool. In this article, we will show you how to create a perfect robots.txt file for SEO.

Using WordPress robots.txt file to improve SEO
What Is a Robots.txt File?

Robots.txt is a text file that website owners can create to tell search engine bots how to crawl and index pages on their site.

It is typically stored in the root directory, also known as the main folder, of your website. The basic format for a robots.txt file looks like this:


User-agent: [user-agent name]
Disallow: [URL string not to be crawled]

User-agent: [user-agent name]
Allow: [URL string to be crawled]


Sitemap: [URL of your XML Sitemap]

You can have multiple lines of instructions to allow or disallow specific URLs and add multiple sitemaps. If you do not disallow a URL, then search engine bots assume that they are allowed to crawl it.

Here is what a robots.txt example file can look like:


User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap_index.xml


In the above robots.txt example, we have allowed search engines to crawl and index files in our WordPress uploads folder.

After that, we have disallowed search bots from crawling and indexing plugins and WordPress admin folders.

Lastly, we have provided the URL of our XML sitemap.

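If you want to double-check how a standards-compliant crawler reads these rules, here is a minimal sketch (not part of the original tutorial) using Python's built-in urllib.robotparser module; the example.com URLs are placeholders:

from urllib import robotparser

# The example rules from above, supplied as plain text.
rules = """
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# Uploads are explicitly allowed; the plugins and admin folders are not.
print(parser.can_fetch("*", "https://example.com/wp-content/uploads/photo.jpg"))  # True
print(parser.can_fetch("*", "https://example.com/wp-content/plugins/some-plugin/"))  # False
print(parser.can_fetch("*", "https://example.com/wp-admin/"))  # False

Keep in mind that the standard library parser is only an approximation of how Google matches rules, but it is handy for a quick sanity check.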

Do You Need a Robots.txt File for Your WordPress Site?

If you don’t have a robots.txt file, then search engines will still crawl and index your website. However, you will not be able to tell search engines which pages or folders they should not crawl.

This will not have much of an impact when you’re first starting a blog and do not have a lot of content.

However, as your website grows and you have a lot of content, you will likely want better control over how your website is crawled and indexed.

Here is why.

Search bots have a crawl quota for each website.

This means that they crawl a certain number of pages during a crawl session. If they don’t finish crawling all the pages on your site, then they will come back and resume crawling in the next session.

This can slow down your website indexing rate.

You can fix this by disallowing search bots from attempting to crawl unnecessary pages like your WordPress admin pages, plugin files, and themes folder.

By disallowing unnecessary pages, you save your crawl quota. This helps search engines crawl even more pages on your site and index them as quickly as possible.

Another good reason to use a robots.txt file is when you want to stop search engines from indexing a post or page on your website.

It is not the safest way to hide content from the general public, but it will help you prevent them from appearing in search results.

What Should an Ideal Robots.txt File Look Like?

Many popular blogs use a very simple robots.txt file. Their content may vary, depending on the needs of the specific site:


User-agent: *
Disallow:
 
Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml

This robots.txt file allows all bots to index all content and provides them a link to the website’s XML sitemaps.

For WordPress sites, we recommend the following rules in the robots.txt file:


User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/

Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml

This tells search bots to index all WordPress images and files. It disallows search bots from indexing WordPress plugin files, the WordPress admin area, the WordPress readme file, and affiliate links.

By adding sitemaps to your robots.txt file, you make it easy for Google bots to find all the pages on your site.

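As an aside (not from the original article), a crawler-style script can discover those sitemap URLs straight from robots.txt. The minimal sketch below uses Python's built-in urllib.robotparser; its site_maps() helper requires Python 3.8+, and the example.com address is a placeholder for your own domain:

from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # download and parse the live robots.txt

# Prints a list of the Sitemap URLs declared in robots.txt,
# or None if the file does not declare any.
print(parser.site_maps())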

Now that you know what an ideal robots.txt file looks like, let’s take a look at how you can create a robots.txt file in WordPress.

How to Create a Robots.txt File in WordPress?

There are two ways to create a robots.txt file in WordPress. You can choose the method that works best for you.

Method 1: Editing Robots.txt File Using Yoast SEO

If you are using the Yoast SEO plugin, then it comes with a robots.txt file generator.

You can use it to create and edit a robots.txt file directly from your WordPress admin area.

Simply go to the SEO » Tools page in your WordPress admin and click on the File Editor link.

File editor tool in Yoast SEO

On the next page, Yoast SEO will show your existing robots.txt file.

If you don’t have a robots.txt file, then Yoast SEO will generate a robots.txt file for you.

Create robots.txt file using Yoast SEO

By default, Yoast SEO’s robots.txt file generator will add the following rules to your robots.txt file:


User-agent: *
Disallow: /

It is important that you delete this text because it blocks all search engines from crawling your website.

After deleting the default text, you can go ahead and add your own robots.txt rules. We recommend using the ideal robots.txt format we shared above.

Once you’re done, don’t forget to click on the ‘Save robots.txt file’ button to store your changes.

Method 2: Editing Robots.txt File Manually Using FTP

For this method, you will need to use an FTP client to edit the robots.txt file.

Simply connect to your WordPress hosting account using an FTP client.

Once inside, you will be able to see the robots.txt file in your website’s root folder.

Editing WordPress robots.txt file using FTP

If you don’t see one, then you likely don’t have a robots.txt file. In that case, you can just go ahead and create one.

Create robots.txt file using FTP

Robots.txt is a plain text file, which means you can download it to your computer and edit it using any plain text editor like Notepad or TextEdit.

After saving your changes, you can upload it back to your website’s root folder.

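If you prefer to script the same download, edit, and upload round trip, here is a minimal sketch (not part of the original tutorial) using Python's built-in ftplib; the hostname, credentials, and remote folder are placeholders for the FTP details from your own hosting account:

from ftplib import FTP

# Placeholder credentials: replace with your own hosting details.
with FTP("ftp.example.com") as ftp:
    ftp.login(user="your-username", passwd="your-password")
    ftp.cwd("/public_html")  # your site's root folder may be named differently

    # Download the existing robots.txt (skip this step if the file does not exist yet).
    with open("robots.txt", "wb") as local_copy:
        ftp.retrbinary("RETR robots.txt", local_copy.write)

    # ...edit robots.txt locally in any plain text editor...

    # Upload the edited file back to the root folder.
    with open("robots.txt", "rb") as local_copy:
        ftp.storbinary("STOR robots.txt", local_copy)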

How to Test Your Robots.txt File?

Once you have created your robots.txt file, it’s always a good idea to test it using a robots.txt tester tool.

There are many robots.txt tester tools out there, but we recommend using the one inside Google Search Console.

Simply log in to your Google Search Console account, and then switch to the old Google Search Console website.

Switch to old Google Search Console

This will take you to the old Google Search Console interface. From here you need to launch the robots.txt tester tool located under the ‘Crawl’ menu.

Robots.txt tester tool

The tool will automatically fetch your website’s robots.txt file and highlight any errors and warnings it finds.

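For a quick local spot check alongside Google's tool, you can also run a few URLs through Python's built-in urllib.robotparser, as in the minimal sketch below (not part of the original tutorial; the domain is a placeholder, and the standard library parser does not report syntax warnings the way Google's tester does):

from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()

# Spot-check a few URLs against the rules a crawler would apply.
for path in ("/wp-admin/", "/wp-content/uploads/photo.jpg", "/sample-post/"):
    allowed = parser.can_fetch("Googlebot", "https://example.com" + path)
    print(path, "allowed" if allowed else "blocked")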

Final Thoughts

The goal of optimizing your robots.txt file is to prevent search engines from crawling pages that are not publicly available. For example, pages in your wp-plugins folder or pages in your WordPress admin folder.

A common myth among SEO experts is that blocking WordPress category, tags, and archive pages will improve crawl rate and result in faster indexing and higher rankings.

This is not true. It’s also against Google’s webmaster guidelines.

We recommend that you follow the above robots.txt format to create a robots.txt file for your website.

We hope this article helped you learn how to optimize your WordPress robots.txt file for SEO. You may also want to see our ultimate WordPress SEO guide and the best WordPress SEO tools to grow your website.

If you liked this article, then please subscribe to our YouTube Channel for WordPress video tutorials. You can also find us on Twitter and Facebook.

Translated from: https://www.wpbeginner.com/wp-tutorials/how-to-optimize-your-wordpress-robots-txt-for-seo/
