Improve robot performance


How can I improve robot performance?

A single robot execution runs from start to finish in a single processing thread, meaning it selects one button at a time and visits one page at a time.

To increase efficiency, you can split your robot into two:

  • one that gets all the URLs of the pages to visit and
  • one that takes those URLs as input.

This allows the second robot to visit pages concurrently (up to your account's concurrency limit), significantly reducing total execution time.
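For illustration only, here is a minimal sketch of that two-robot pattern in TypeScript (Node 18+, using the built-in fetch). It is not tied to any particular robot platform; the function names, URLs, and extraction logic are placeholders standing in for the two robots.

```typescript
// Sketch of the two-robot pattern: one step collects URLs, the next visits
// them with a bounded number of pages in flight at once.

async function collectPageUrls(listUrl: string): Promise<string[]> {
  // Placeholder for the first robot, which extracts the links to visit.
  const res = await fetch(listUrl);
  const html = await res.text();
  return [...html.matchAll(/href="([^"]+)"/g)].map((m) => m[1]);
}

async function scrapePage(url: string): Promise<void> {
  // Placeholder for the second robot's per-page extraction logic.
  const res = await fetch(url);
  console.log(url, res.status);
}

// Run the per-page tasks with at most `limit` in flight at once,
// mirroring an account-level concurrency cap.
async function runWithConcurrency(urls: string[], limit: number): Promise<void> {
  const queue = [...urls];
  const workers = Array.from({ length: limit }, async () => {
    while (queue.length > 0) {
      const next = queue.shift();
      if (next) await scrapePage(next);
    }
  });
  await Promise.all(workers);
}

// Example: a modest limit, well under 10 for a smaller site.
collectPageUrls("https://example.com/listing")
  .then((urls) => runWithConcurrency(urls, 5))
  .catch(console.error);
```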

Note

Increase concurrency with care and respect for the target site, so you don't interrupt its services or put excessive load on it.

For smaller sites, keep concurrency below 10 robots. For sites that handle larger amounts of traffic, you can usually go a bit higher.

Always read the terms and policies for the site you're scraping to ensure you're complying with their terms.

How do I disable images, stylesheets and JavaScript?

You cannot globally disable images, stylesheets, or JavaScript with a single click, but you can block specific network requests, which speeds up page load time.

How to block or ignore network requests

Blocking network requests for certain unnecessary elements can improve robot performance; a conceptual code sketch of the technique follows the steps below.

When using the robot editor,

  1. Select the Network tab to get an overview of the network traffic involved in a page request.
  2. Select the URL icon to mark scripts and elements for blocking or ignoring.
  3. Here, you can block or ignore specific URLs, file types, or entire domains.
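The robot editor handles this through the interface above. Purely as an illustration of the underlying technique, here is a hedged TypeScript sketch that blocks requests by file type and by domain using Playwright's request interception. Playwright is not the robot editor, and the patterns, domain, and URL are examples only.

```typescript
import { chromium } from "playwright";

async function main(): Promise<void> {
  const browser = await chromium.launch();
  const page = await browser.newPage();

  // Abort requests for heavy, unnecessary file types (images, styles, fonts).
  await page.route(/\.(png|jpe?g|gif|svg|css|woff2?)(\?.*)?$/, (route) =>
    route.abort()
  );

  // Abort everything from an entire domain, e.g. an analytics host.
  await page.route(/google-analytics\.com/, (route) => route.abort());

  await page.goto("https://example.com");
  console.log(await page.title());
  await browser.close();
}

main().catch(console.error);
```

Blocking only skips the download; pages that depend on the blocked resources may render or behave differently, so test the robot after adding blocks.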


By default, we block Google Analytics and other tracking scripts; we don't want to skew the analytics data of any website we scrape.


