Monitor, archive, go back in time… Website Watchman is an easy-to-use website archival utility.
- Monitor a whole website, part of a website or a single page
- Set up configs for multiple sites / pages
- Schedule hourly, daily, weekly or monthly scans
- Be alerted to any change: visible text, source code or the page’s resources
- View and be able to demonstrate what a page looked like on a particular date
- Be aware of every change to a competitor’s page / site
- Runs locally, not a cloud service. Own your own data.
- An archive is kept, including all changes to pages, images, style sheets and JavaScript
- View a ‘living’ version of a historical page, not a screenshot
- Switch between versions of the page to compare them
- Export a historical page as an image, or as a collection of all of its files
- Export the entire site, preserving all files as they were on a given date, or processed to make a browsable local copy of the site
- Now able to archive all files from a website (where the URL is discovered). The new switch has been added to the 'Filtering' dialogue. The default behaviour is unchanged: only certain files are archived, namely PDFs and files that are part of the page (CSS, JS and images). Beware: turning the filter off can result in downloading potentially large files such as zip archives and audio/video.
- Various small fixes, including: some pages would not display properly in the archive browser, or after export with processing, if they had a base href specifying an absolute base address. The base href is now correctly removed in those situations.
Requires macOS 10.12.0 or later, 64-bit Intel processor.