Sahi Documentation

Web Crawler APIs

Web Crawler APIs are used to navigate to all pages of a website to check if the links are working.

_crawlWebsite

Since:
Sahi Pro: 9.0.0
Sahi OS: NA
Sahi Pro Starter: NA
Desktop Add-On: NA
Mobile Add-On: NA
SAP Add-On: NA
AI Assist Add-On: NA

Available for modes: Browser

_crawlWebsite($websiteURL, $depth, $csvOutputFilePath)

Arguments
$websiteURL         string   URL of the website to crawl
$depth              integer  How deep Sahi should navigate from each link
$csvOutputFilePath  string   Path of the output CSV file

Returns
null

Details

This API is used for testing links on a website. Sahi navigates to all the links and stores the information in a CSV file. The CSV file contains the test page, the number of links found, and any error found on each link (errors: a JavaScript error on link click or a network error).
// Verifies all the links present in sahitest.com
_crawlWebsite("http://sahitest.com", 1, "output1.csv");

// Verifies all the links present on sahitest.com, and also all the links present on each page linked from sahitest.com
_crawlWebsite("http://sahitest.com", 2, "output2.csv");

// Use the _artifact API to save a copy of the output file and access it from the Sahi Reports page.
_artifact("output1.csv");