Online Website Link Checker

Examples #

Try a few example inputs:
  • http://www.wikipedia.org
  • http://www.fit2fat2fit.com, Depth set to 3
  • http://web.mit.edu, Depth set to 2, Report broken links only disabled

Description #

Online Website Link Checker is a handy tool for every webmaster. It lets you check whether your website contains broken links. Visitors are annoyed when they click a link that does not work, and it does not matter whether it points to a page on your own site that you removed or renamed, or to a third-party page that you cannot control. A broken link on your website is your problem, and Online Website Link Checker helps you reveal the problem as soon as possible.

Online Website Link Checker visits the web page you specify and builds a list of all links on that page. It then checks whether every link in the list is valid. Because almost all websites consist of more than one page, checking the links of every page separately would be time-consuming. For this reason you can specify the depth of the link verification process, which lets you check the links of your whole website easily. Online Website Link Checker visits not only the source page you specify but also all pages of your website that the source page links to. And if a page linked from the source page links to another page within your website, its links are checked too, and so on.

Our engine fully analyzes HTML (versions 4 and 5) and CSS code, identifying and verifying links in all standard elements.
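As an illustration only (not the tool's actual engine), the sketch below shows how link targets can be collected from standard HTML elements using Python's built-in html.parser; the set of URL-carrying attributes is an assumption chosen for the example:

# Illustration only: collect link targets from standard HTML elements.
from html.parser import HTMLParser

# Attributes assumed to carry URLs in common HTML 4/5 elements.
URL_ATTRS = {"href", "src", "action", "formaction", "poster", "data"}

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in URL_ATTRS and value:
                self.links.append(value)

collector = LinkCollector()
collector.feed('<a href="/about">About</a><img src="logo.png">')
print(collector.links)   # ['/about', 'logo.png']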

Usage #

Specify the URL of the web page to start from and click the "Check!" button. By default, the Depth value is set to 1, which means that Online Website Link Checker checks only the links found on the source page you specified. If you set the depth to 2, second-level links are checked too; these are the links on pages within the same domain as the source page that the source page links to. Similarly, if you set the depth to 3, third-level links are also checked; these are the links found on the second-level pages.

Note that if you link to web pages on another domain, those links are checked, but regardless of the depth value, the links found on those third-party pages are not checked.
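To make the depth rules concrete, here is a minimal sketch of such a depth-limited crawl; crawl(), check_link() and extract_links() are hypothetical helpers invented for the example, and Range, robots.txt and page limits are ignored:

# Minimal sketch of a depth-limited link check (not the tool's actual engine).
# check_link() and extract_links() are hypothetical helpers: the first returns
# True if a URL responds, the second returns the raw link targets found on a page.
from urllib.parse import urljoin, urlparse

def crawl(start_url, depth, check_link, extract_links):
    start_domain = urlparse(start_url).netloc
    results = {}                 # url -> True (valid) or False (broken)
    crawled = set()              # pages whose links have already been extracted
    frontier = [start_url]
    for level in range(depth):
        next_frontier = []
        for page in frontier:
            if page in crawled:
                continue
            crawled.add(page)
            for link in extract_links(page):
                url = urljoin(page, link)
                if url not in results:
                    results[url] = check_link(url)   # every link is checked...
                # ...but only pages on the start domain are crawled further.
                if urlparse(url).netloc == start_domain:
                    next_frontier.append(url)
        frontier = next_frontier
    return results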

Use the Range option to specify which web pages should be checked for (broken) links. The Check pages within the specified subdomain only option makes Online Website Link Checker check only pages within the single subdomain¹ given in the URL parameter. By choosing the Check pages within the whole domain option you tell the checker to crawl across all subdomains of the URL's main domain.
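As an illustration only (the actual matching rules may differ in detail), the two Range settings can be thought of as two host comparisons, with www. stripped so that www.domain.tld and domain.tld count as the same subdomain:

# Illustration only: how the two Range settings could be interpreted.
from urllib.parse import urlparse

def same_subdomain(url, start_url):
    # www.domain.tld and domain.tld are treated as the same subdomain.
    strip = lambda host: host[4:] if host.startswith("www.") else host
    return strip(urlparse(url).netloc) == strip(urlparse(start_url).netloc)

def same_domain(url, start_url):
    # Naive registrable-domain check: compare the last two host labels.
    # (A real implementation would use the Public Suffix List.)
    base = lambda host: ".".join(host.split(".")[-2:])
    return base(urlparse(url).netloc) == base(urlparse(start_url).netloc)

print(same_subdomain("http://www.example.com/a", "http://example.com/"))   # True
print(same_domain("http://blog.example.com/", "http://www.example.com/"))  # True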

You can also specify the maximum number of pages that the checker may crawl using the Maximum pages to crawl field. The default limit is 500 pages, and you can increase it up to 10,000 pages. However, if you set a limit higher than the default value, you pay extra credits for each 500 pages over the default limit. Note that your final cost is based on the actual number of pages that the checker crawls: at first we debit your account with as many credits as are needed to cover the maximum limit you set, and after the task is completed we credit back any unused credits. For example, if you have enough credits in your account, you can safely set the limit to 10,000; if checking the website requires crawling only 3,350 pages, the unused credits are returned and your final cost is as if you had set the limit to 3,500 (the crawled pages rounded up to the next block of 500).
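A small worked sketch of this rounding, assuming (as the example above implies) that extra credits are charged per started block of 500 pages above the 500-page default:

# Sketch of the billing rounding described above (an assumption, not official pricing code).
import math

DEFAULT_LIMIT = 500   # pages covered by the base price
BLOCK = 500           # extra credits are assumed to be charged per started block of 500 pages

def billable_pages(pages_crawled):
    # Round up to the next 500-page block, but never below the default limit.
    return max(DEFAULT_LIMIT, math.ceil(pages_crawled / BLOCK) * BLOCK)

print(billable_pages(3350))   # 3500 -> billed as if the limit had been set to 3,500
print(billable_pages(120))    # 500  -> the default limit is the minimum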

Many websites employ the robots.txt protocol. In short, the owner of a website puts a robots.txt file in the root of the (sub)domain (e.g. http://www.google.com/robots.txt), and all well-behaved web crawlers download the file before crawling the (sub)domain to find out which URLs are forbidden for crawling. Online Website Link Checker lets you respect or ignore the robots.txt protocol by simply checking or unchecking the Respect robots.txt option.

If you want to set up robots.txt rules specifically for Online Website Link Checker, use odt as the user-agent value. For example, to disallow our engine from visiting the /Test/ folder, use:

User-agent: odt
Disallow: /Test/
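If you want to verify locally which URLs such rules block for the odt user agent, Python's standard urllib.robotparser implements the same protocol; this quick check is not part of the tool itself, just a way to test your rules:

# Quick local check of robots.txt rules for the "odt" user agent.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: odt",
    "Disallow: /Test/",
])
print(rp.can_fetch("odt", "/Test/page.html"))       # False - blocked for odt
print(rp.can_fetch("odt", "/index.html"))           # True  - allowed
print(rp.can_fetch("otherbot", "/Test/page.html"))  # True  - the rule targets odt only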

For webmasters who want to analyze their websites thoroughly, we provide the Check URLs in CSS and Check links in HTML forms options. Use the first option if you want to check all URLs in style attributes, in <style> elements, and in external CSS stylesheets. If the Check links in HTML forms option is set, URLs in form action attributes and in inputs' formaction attributes are checked, i.e. <form action="URL"> and <input formaction="URL">.
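As a rough illustration of the CSS case only (real CSS parsing is more involved), URLs can be pulled out of stylesheet text by matching url(...) references:

# Illustration only: a simplified extraction of url(...) references from CSS text.
import re

CSS_URL = re.compile(r"""url\(\s*['"]?([^'")\s]+)['"]?\s*\)""", re.IGNORECASE)

css = """
body { background: url("/img/bg.png"); }
.logo { background-image: url(https://cdn.example.com/logo.svg); }
"""
print(CSS_URL.findall(css))
# ['/img/bg.png', 'https://cdn.example.com/logo.svg']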

You can also choose whether you want to see valid links in the output. If the Report broken links only option is set, you will see only broken links in the final report; if it is not set, you will be given a list of all links.


¹ www.domain.tld and domain.tld are always considered the same subdomain.

Limits #

  • URL – The URL where the crawler should start looking for (broken) links.
  • Depth – An integer between 1 and 10. See the Usage section for more information.
  • Range – Restricts where the crawler is supposed to look for (broken) links.
  • Maximum pages to crawl – An integer between 1 and 10,000. You always pay for the real number of crawled pages only. See the Usage section for more information.

Other Limits #

Website Link Checker is restricted to run for at most 45 minutes; its execution is terminated after this period elapses. The total size of Website Link Checker's output is limited to 32 MB. If you reach this limit, consider enabling the Report broken links only option, which dramatically lowers the output size.

Link Checker API #

An API for Website Link Checker is available as part of the Online Domain Tools API.
