
FAQ

Frequently asked questions about the generator (Sitemap FAQ)

Please read this information

See the video tutorial on how to work with our service.

If you are having problems with website indexing:
Why a website is not being indexed: possible reasons
For users of the Updateable Sitemaps service:
Connecting the gateway to your website
If you have any problems with payment:
Payment through WebMoney
Payment through QIWI
Payment through credit card

FAQ contents

What are the requirements for the domain homepage, so that the crawler can index my website?
The page must be accessible, return HTML code, and contain links to internal pages. The crawler scans the website by following the links it finds on the homepage. The homepage may also contain an intra-host server redirect, which will be processed.
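The requirement above boils down to: the homepage must return parseable HTML with at least one link for the crawler to follow. A minimal sketch of that link-discovery step, using only Python's standard library (a simplified model, not the service's actual crawler code; the sample HTML is made up):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags, mimicking how a crawler
    discovers internal pages from the homepage markup."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str) -> list[str]:
    parser = LinkCollector()
    parser.feed(html)
    return parser.links

# A homepage without any <a> links gives the crawler nothing to follow.
sample = '<html><body><a href="/about">About</a><a href="/news">News</a></body></html>'
print(extract_links(sample))  # → ['/about', '/news']
```

If this function returned an empty list for your homepage, the crawler would stop at a single page.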

What types of website URLs does the generator accept?

  • International domain names (gTLD)
  • Internationalized domain names (IDN)
  • National domain names (ccTLD)
The generator only accepts URLs that are actual domain names; the domain may be of any level. Support for indexing Cyrillic domain names, including .рф domains, was introduced in January 2011!
URL Examples:
  • http://mydomain.com
  • http://sub.mydomain.eu
  • http://мой-домен.рф

What is the limit on the number of indexed pages in the free generator?
The free version of the generator can index up to 500 website pages.
Redirects and broken links do not count toward this limit.

Are the crawl restrictions specified in robots.txt taken into account during indexing?
This is optional. If the option is ticked, the generator follows the Allow and Disallow directives in the general User-agent: * section.
The personal User-agent: Google or User-agent: Yandex sections are considered when you choose the corresponding crawler identification type.
In addition, you may create a User-agent: Mysitemapgenerator section, which is taken into account when you choose direct identification of our robots (Mysitemapgenerator-HTTP, or Mysitemapgenerator-Mobile for the mobile Sitemap).
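As an illustrative sketch (the section names come from the answer above; the paths are made up), a robots.txt combining a general section with a personal section for this service's robot might look like this:

```
# Applies to crawlers that have no personal section of their own
User-agent: *
Disallow: /admin/

# Applies only when direct Mysitemapgenerator identification is chosen
User-agent: Mysitemapgenerator
Disallow: /drafts/
```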

Indexing hidden pages (Deep Web)
The Deep Web (also called the Deepnet, the Invisible Web, the Undernet, or the hidden Web) consists of web pages that are not indexed by search engines because no accessible page links to them - for example, pages generated through HTML form interfaces, or frame content.
If you wish to discover such pages and include them in the Sitemap, please tick the appropriate options:

  • index forms (the form is submitted without being filled in);
  • index frames (contents of <frameset> and <iframe>).

What happens to links inside <noindex> blocks or links with the nofollow attribute?
If the option is enabled, they are not considered.
If needed, you may also choose to ignore only <noindex> blocks or only nofollow links, independently of each other.
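As a small HTML illustration (the URLs are made up), these are the two markup patterns the option refers to:

```html
<!-- A link inside a <noindex> block (a non-standard tag honored by Yandex) -->
<noindex><a href="/internal-stats">Statistics</a></noindex>

<!-- A link carrying the rel="nofollow" attribute -->
<a href="/login" rel="nofollow">Log in</a>
```

With the option enabled, neither /internal-stats nor /login would be followed or added to the Sitemap.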

How does the crawler process intra-host server redirects?
Redirect processing is available both in the free version and in Sitemap Pro (the paid version).
The crawler recognizes the following standard HTTP status codes:

  • 301 Moved Permanently
  • 302 Found
  • 303 See Other
  • 307 Temporary Redirect
If a page of your website redirects to an address on the same domain, the crawler will index the page specified in the redirect target.

Processing and removing phpsessid and sessionID (session identifiers in PHP and ASP applications)
While your website is being indexed, it may generate session IDs. Our crawler detects and removes these session identifiers: all links are written to the Sitemap file "clean", without a phpsessid parameter (for PHP) or a sessionID object (for ASP) passed in the URL. This prevents duplicate links from entering the Sitemap when the robot receives the same page under different URLs.

Example of session identifier in PHP:

    http://site.com/page.html?PHPSESSID=123456session6789
Example of a session identifier in ASP:
    http://site.com/(S(123456session6789))/page.html
Finally, the URL will be transformed back to a basic form:
    http://site.com/page.html
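The normalization described above can be sketched in Python using only the standard library (a simplified model, not the service's actual implementation):

```python
import re
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_session_ids(url: str) -> str:
    """Remove PHP and ASP session identifiers so the same page always
    maps to one canonical URL in the sitemap."""
    # ASP embeds the session in the path as /(S(<id>))/ - drop that segment.
    url = re.sub(r"/\(S\([^)]*\)\)", "", url)
    parts = urlsplit(url)
    # Drop PHPSESSID from the query string, keeping all other parameters.
    query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
             if k.lower() != "phpsessid"]
    return urlunsplit(parts._replace(query=urlencode(query)))

print(strip_session_ids("http://site.com/page.html?PHPSESSID=123456session6789"))
print(strip_session_ids("http://site.com/(S(123456session6789))/page.html"))
# Both print the canonical form: http://site.com/page.html
```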

How are images added to the Sitemap?
The generator can gather information about images* located on your website pages and add it to the Sitemap file. For each page URL on which images are found, corresponding entries are added according to the Google Sitemap-Image protocol.
The next example shows part of a Sitemap record for the URL http://site.ru/sample.html, which has two images:

 <url>
   <loc>http://site.ru/sample.html</loc>
   <image:image>
     <image:loc>http://site.ru/logo.jpg</image:loc>
   </image:image>
   <image:image>
     <image:loc>http://site.ru/photo.jpg</image:loc>
   </image:image>
 </url>

* Only images hosted on the indexed website are included in the Sitemap. Images embedded in web pages from other servers are not considered.

How does filtering of heterogeneous content work?
Unlike the free version, where link availability checking ends together with the indexing process (once 500 URLs have been found), in the paid version of the generator checking continues to the last link, even after indexing is complete. This guarantees that redirects and dead links will not be included in the Sitemap.
Although their presence complies with the Sitemaps protocol and is not an error, links such as redirects can trigger corresponding warnings in Google Webmaster Tools about non-direct links in the sitemap.

What information is contained in the report created after a website is scanned?
If our crawler encounters difficulties or obstacles while indexing your website, a detailed report is created. In the report you will see grouped lists of pages with errors, among them "Page not found", internal server errors, etc. Besides the errors, the report contains information about all detected server redirects.
Reports are available in the paid version.

I have a very large website. What happens when the number of scanned pages exceeds the maximum allowed 50 000 URLs?
By default, a large sitemap is split in accordance with the Sitemap protocol and search-engine recommendations: you will get several Sitemap files containing up to 50 000 URLs each.
You may also choose the number of URLs per file yourself.
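The default splitting behavior can be sketched as follows (a simplified model of the chunking step only; names and sample URLs are made up):

```python
MAX_URLS_PER_FILE = 50_000  # limit from the sitemaps.org protocol

def split_into_sitemaps(urls, per_file=MAX_URLS_PER_FILE):
    """Partition a URL list into chunks, one chunk per Sitemap file."""
    return [urls[i:i + per_file] for i in range(0, len(urls), per_file)]

# 120 000 discovered pages end up in three files.
chunks = split_into_sitemaps([f"http://site.com/p{i}" for i in range(120_000)])
print([len(c) for c in chunks])  # → [50000, 50000, 20000]
```

Passing a smaller per_file value corresponds to choosing the number of URLs per file yourself.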

How do I use data filters?
A data filter is a convenient tool used during sitemap creation: along with the page URL, it lets you specify data that is important for search engines, namely the priority of particular pages relative to other pages of the website and the update frequency.
Additionally, masks allow you to exclude from the index particular pages that are not needed in the Sitemap file.
Data filters can be applied either to individual pages (enter the full URI of the page) or to groups of pages (enter the part of the URL that all similar pages share, for example .jpg or /directory/files).
Please note: data filters are case sensitive!
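The mask matching described above can be sketched in Python (a simplified substring model; the function name and sample URLs are made up):

```python
def exclude_by_masks(urls, masks):
    """Keep only URLs that contain none of the masks.
    Matching is a plain case-sensitive substring test."""
    return [u for u in urls if not any(m in u for m in masks)]

urls = ["http://site.com/index.html",
        "http://site.com/images/photo.jpg",
        "http://site.com/private/readme.txt"]
print(exclude_by_masks(urls, [".jpg", "/private/"]))
# → ['http://site.com/index.html']
```

Note that because the test is case sensitive, a mask of ".jpg" would not exclude a URL ending in ".JPG".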

How does the Get by e-mail function work?
We recommend this function if you have a large website whose scanning may take a long time. With this option you don't have to wait for the crawler to finish its work - you can receive the results directly by e-mail. The feature is available both in the paid version (the ready* Sitemap file is sent to the specified address) and in the free version of the generator (you get a link to download the ready file from our server).

* If the total size of the created Sitemap files exceeds 10 MB, you will get a link to download them from our server instead.

For how many days are created Sitemaps available for download at the sent links?
The guaranteed storage time on our server is:

  • For files created in the free version - 2 days,
  • For files created in the paid version - 7 days.

How do I check the website indexing status?
All registered users can see information on every indexing run, including websites currently being indexed, in the Generated sitemaps section of their personal account.
If you are not registered, you can track the status of the indexing process via the express-check form available at the bottom of the homepage, mysitemapgenerator.com.

What determines the speed of my website indexing?
Indexing speed depends on many dynamic factors, such as the responsiveness of your server and the size of the loaded pages, so it cannot be calculated in advance.
The structure of internal linking between pages also has a large impact on crawling time.

Can I stop the website indexing process before it is finished?
This option is available to registered users. In your personal account, go to Generated sitemaps. The table displays information about all of your created Sitemaps, as well as about websites currently being indexed. To interrupt indexing without waiting for the crawler to scan the entire site, click the Stop button. You will then receive a Sitemap generated only from the pages that had been indexed at the moment of the stop.

How do I let search engines know about my Sitemap file?
Register your website in the webmaster services provided by the search engines (for example, webmaster.yandex.ru for Yandex or www.google.com/webmasters for Google). After registration you will be able to submit sitemap files from your account.
Another common way is to include the following line in robots.txt:

Sitemap: http://mysite.ru/mysitemapfile.xml
If you need to provide several sitemap files, add a separate line for each file:
Sitemap: http://mysite.ru/mysitemapfile1.xml
Sitemap: http://mysite.ru/mysitemapfile2.xml
Sitemap: http://mysite.ru/mysitemapfile3.xml

Choosing the optimal indexing speed and load on your web server
The crawler options offer three indexing speed levels, each creating a corresponding load on the server being indexed:

  • Maximum - used by default. If you have quality paid hosting, you most likely do not need to worry about the load created while your site is indexed. We recommend this level, which allows the crawler to index your site at top speed.
  • Average - choose this level if your server requires a gentler indexing mode.
  • Low - a level that indexes your site while creating a minimal load on the server. It is recommended for websites on free hosting or for sites that require a limited flow of traffic.
    We recommend selecting this mode when indexing sites hosted on uCoz servers.
    Note, however, that this level slows down the indexing of your site.

Indexing as seen by the crawler
You may choose one of the identification options for our web crawler* (search robot) that indexes your website:

  • Standard browser - the crawler uses this option by default, and it is the recommended one. Your website loads the same way your regular visitors see it.
  • YandexBot - this option indexes your website as the Yandex search robot sees it. Our crawler identifies itself as the main Yandex indexing robot (YandexBot/3.0).
  • Googlebot - this option indexes your website as the Google search robot sees it. The crawler identifies itself as the Google web-search robot (Googlebot/2.1).
  • Mysitemapgenerator - use direct identification of our robot if you need separate control settings and the ability to manage website access.
Pay attention to how the robots.txt file is processed under the different identification options:
  • With YandexBot or Googlebot, only the instructions for that particular robot are considered (User-agent: Yandex or User-agent: Googlebot respectively). The general User-agent: * section is used only when the personal one is missing.
  • With Standard browser or Mysitemapgenerator, the crawler considers only the instructions in the general User-agent: * section. Personal sections such as User-agent: Yandex or User-agent: Googlebot are not considered.
* The feature is provided as a User agent and is subject to paragraphs 1.3 and 1.4 of the Public Offer.
