Cyotek WebCopy is a free tool for copying full or partial websites to your hard drive for offline viewing. It scans the specified website and downloads its content to your hard drive. Links to resources such as style sheets, images, and other pages on the website are automatically remapped to match the local path. Using its extensive settings, you can define which parts of a website will be copied and how. This software can be used for free, but as with all free software, there are costs involved in its development and maintenance.
What can WebCopy do?
WebCopy examines the HTML markup of a website and attempts to discover all linked resources, such as other pages, images, videos, and file downloads. It downloads each of these resources and continues to search them for more. In this way, WebCopy can “crawl” an entire website and download everything it finds, creating a reasonable facsimile of the source website.
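The discovery step can be illustrated with a short sketch. The parser class and sample markup below are hypothetical and not part of WebCopy, but the approach, parsing the HTML and collecting every `href` and `src` attribute, mirrors what a crawler does when it examines a page:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect URLs referenced by href/src attributes in an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

# Hypothetical markup standing in for a downloaded HTML document.
sample = '<a href="/about.html">About</a><img src="logo.png"><link href="site.css" rel="stylesheet">'
collector = LinkCollector()
collector.feed(sample)
print(collector.links)  # ['/about.html', 'logo.png', 'site.css']
```

A real crawler would then download each discovered URL, parse any HTML it receives the same way, and repeat until no new links are found.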
What can’t WebCopy do?
WebCopy does not download the raw source code of a website; it can only download what the HTTP server returns. While it will do its best to create an offline copy of a website, advanced data-driven websites may not work as expected once they have been copied.
Features and highlights
Rules
Rules control the behavior of a scan, for example by excluding a section of the website. Additional options are also available, such as downloading a URL to include in the copy without crawling it.
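WebCopy has its own rule configuration, but the idea of excluding a section of a website can be sketched as a simple URL filter. The patterns and URLs below are made up for illustration:

```python
import re

# Hypothetical exclusion rules: skip anything under /private/ or ending in .zip.
exclusions = [re.compile(r"/private/"), re.compile(r"\.zip$")]

def should_crawl(url):
    """Return False if the URL matches any exclusion rule."""
    return not any(rule.search(url) for rule in exclusions)

print(should_crawl("/blog/post-1.html"))    # True
print(should_crawl("/private/admin.html"))  # False
print(should_crawl("/downloads/site.zip"))  # False
```

The crawler consults such a filter before fetching each discovered URL, so excluded sections are never downloaded or searched for further links.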
Forms and passwords
Before analyzing a website, you can post one or more forms, for example to log in to an administration area. HTTP 401 challenge authentication is also supported, so if your website contains protected areas, you can pre-define usernames and passwords, or let WebCopy prompt you for credentials during the scan.
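Responding to an HTTP 401 challenge with Basic authentication ultimately comes down to sending an Authorization header built from the username and password. This sketch, with made-up credentials, shows how that header value is formed per the Basic scheme:

```python
import base64

def basic_auth_header(username, password):
    """Build the Authorization header value for HTTP Basic authentication."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"

print(basic_auth_header("admin", "secret"))  # Basic YWRtaW46c2VjcmV0
```

A client retries the request with this header after receiving a 401 response; pre-defined credentials simply let a tool do this automatically instead of prompting.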
Link map viewer
After you have analyzed your website, the Link Map Viewer shows all the links found, both internal and external. Filtering makes it easy to view the different types of links found.
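The internal/external distinction can be sketched by comparing each link's host against the host of the site being copied. The helper and sample links below are hypothetical:

```python
from urllib.parse import urlparse

def classify(link, site_host):
    """Label a link as internal or external relative to the site being copied."""
    host = urlparse(link).netloc
    # Relative links have no host and therefore belong to the site itself.
    return "internal" if host in ("", site_host) else "external"

# Hypothetical links found during a scan of example.com.
links = ["/about.html", "https://example.com/contact", "https://other.org/page"]
print([classify(l, "example.com") for l in links])  # ['internal', 'internal', 'external']
```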
Configurable
There are many settings for configuring how your website is crawled. In addition to the rules and forms mentioned above, you can also configure domain aliases, user agent strings, default documents, and more.
Reporting
After scanning a website, you can view lists of pages, errors, missing pages, multimedia resources, and more.
Regular expressions
Several configuration options make use of regular expressions, and the built-in editor lets you test expressions easily.
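Testing an expression outside a built-in editor is straightforward in any language with a regex library. This snippet uses a made-up pattern, matching image-file URLs, to show the kind of check such an editor performs:

```python
import re

# Hypothetical pattern: match URLs for common image file extensions.
pattern = re.compile(r"\.(png|jpe?g|gif)$", re.IGNORECASE)

for url in ["/img/logo.PNG", "/img/photo.jpeg", "/css/site.css"]:
    print(url, bool(pattern.search(url)))
# /img/logo.PNG True
# /img/photo.jpeg True
# /css/site.css False
```

Trying a pattern against a handful of sample inputs like this, before committing it to the configuration, helps catch over- or under-matching early.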
Diagrams
View and customize a visual diagram of your website, which can also be exported as an image.
Note: Requires the .NET Framework.