Robots.txt Generator

Default - All Robots:
Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo MM, Yahoo Blogs, DMOZ Checker, MSN PicSearch
Restricted Directories: the path is relative to the root and must contain a trailing slash "/"

Now, create a "robots.txt" file in your root directory. Copy the text above and paste it into the text file.
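
For illustration only, a file generated with the "all robots" default, a sitemap, and one restricted directory might look like this (the sitemap URL and the /cgi-bin/ path are placeholders, not actual tool output):

User-agent: *
Disallow: /cgi-bin/
Sitemap: https://www.example.com/sitemap.xml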

About the Robots.txt Generator

The Robots.txt Generator is an SEO tool used to generate a robots.txt file instantly for your website. Whenever a search engine crawls a website, it first looks for the robots.txt file located at the domain root level, for example https://www.example.com/robots.txt.
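
To sketch how that lookup works on the crawler's side (this is only an illustration, not part of the generator; the example.com URLs are placeholders), Python's standard-library parser fetches the robots.txt from the domain root and reports whether a given URL may be crawled:

from urllib.robotparser import RobotFileParser

# Fetch and parse the robots.txt at the domain root
# (example.com is a placeholder domain).
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# Ask whether a particular crawler may fetch a particular URL.
print(rp.can_fetch("Googlebot", "https://www.example.com/temp/page.html"))
print(rp.can_fetch("*", "https://www.example.com/"))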

Robots.txt is a text file webmasters create to instruct web robots (typically search engine robots) how to crawl pages on their website. The robots.txt file is part of the robots exclusion protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content up to users. The REP also includes directives like meta robots, as well as page-, subdirectory-, or site-wide instructions for how search engines should treat links (such as “follow” or “nofollow”).

Structure of a Robots.txt File

The structure of a robots.txt file is pretty simple (and hardly flexible) – it is essentially a list of user agents and the files and directories disallowed to them. Basically, the syntax is as follows:

User-agent: [crawler name, or * for all crawlers]
Disallow: [file or directory to exclude]
“User-agent:” names the search engine crawler a set of rules applies to, and “Disallow:” lists the files and directories to be excluded from indexing. In addition to “User-agent:” and “Disallow:” entries, you can include comment lines – just put the # sign at the beginning of the line:

# All user agents are disallowed from the /temp directory.
User-agent: *
Disallow: /temp/

The Traps of a Robots.txt File

When you start making complicated files – i.e. you decide to allow different user agents access to different directories – problems can start if you do not pay special attention to the traps of a robots.txt file. Common mistakes include typos and contradicting directives. Typos include misspelled user agents, misspelled directory names, missing colons after User-agent and Disallow, and so on. Typos can be tricky to find, but in some cases validation tools help.
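
For example, both lines in this small sketch (the /temp/ path is just a placeholder) contain typos: the colon after User-agent is missing and Disallow is misspelled, so crawlers will typically ignore these lines instead of applying the intended rule:

User-agent *
Disalow: /temp/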

The more serious problem is with logical errors. For instance:

User-agent: *
Disallow: /temp/

User-agent: Googlebot
Disallow: /images/
Disallow: /temp/
Disallow: /cgi-bin/