A robot is configured using the properties described below.
This tab contains the basic robot properties.
- Default Options:
- Here, you can configure the default options for the step actions of the robot.
These options are described here.
- Robot Comment:
- Here, you can enter a comment about the robot.
This tab contains various advanced properties.
- Robot Id:
- This is a legacy property that contains an optional id for the robot. A robot must have an id if you use the
Database Storage Environment or the
Database Message Environment.
The id must be unique among all robots. If you use RoboManager, you should use
that application to keep track of robot ids. You can register your robot in RoboManager and get an id for it by clicking "Register...".
If you do not use RoboManager, you will have to keep track of the robot ids yourself.
- Proxy Server:
- This property specifies an optional proxy server to use for all page and data loading done by this particular robot.
You should use this property only rarely. Normally, it is better to specify one or more proxy servers for the entire Kapow Katalyst installation.
This is most easily done in the Kapow Katalyst Settings application. See the Kapow Katalyst Installation Guide for further details on this.
The proxy server specified for a particular robot will override proxy servers specified in any other way.
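The precedence rule above can be sketched as follows. This is an illustrative helper, not part of the Kapow Katalyst API; the function name and proxy strings are hypothetical.

```python
def effective_proxy(robot_proxy, installation_proxy):
    """Illustrative precedence rule: a proxy server configured on a
    particular robot, if set, overrides the installation-wide proxy
    (hypothetical helper, not a Kapow Katalyst API)."""
    return robot_proxy if robot_proxy is not None else installation_proxy

# A robot-specific proxy wins over the installation-wide setting.
print(effective_proxy("robot-proxy.example.com:8080", "global-proxy.example.com:8080"))
# With no robot-level proxy, the installation-wide proxy applies.
print(effective_proxy(None, "global-proxy.example.com:8080"))
```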
- HTTP Client:
- The client used to make HTTP requests to remote sites.
- NTLM Authentication:
- Kapow Katalyst has built-in support for the NTLM authentication scheme over HTTP (both for proxies and target systems).
However, in some cases we have seen compatibility issues with some versions of Microsoft IIS. If Kapow Katalyst cannot authenticate with your
system, an alternative NTLM authentication engine named JCIFS is available. To use JCIFS, download the JCIFS library version
1.3.16 JAR file from http://jcifs.samba.org, place it in the lib folder of
your Kapow Katalyst installation directory, and select "JCIFS" as the NTLM authentication engine in the
configuration of the robot.
- Enable Private HTTP Cache:
- Select this option to enable private HTTP caching. Pages received from a server marked with
Cache-Control: private contain information specific to a particular client and
are never stored in the global HTTP cache. To never cache such pages, disable this
option. To store such pages in a robot-specific cache, enable it. The downside to
enabling private HTTP caching is that each robot uses more memory. If you are running a large number of
robots on the same server, you can disable this option to decrease their memory footprint.
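The routing described above can be sketched as follows. This is only an illustration of the behavior, not Kapow Katalyst's implementation; the function and return labels are hypothetical.

```python
def cache_destination(headers, private_cache_enabled):
    """Illustrative routing of a response by its Cache-Control header:
    private responses never enter the shared global cache; they go to a
    robot-specific private cache only if that cache is enabled
    (a sketch of the described behavior, not Kapow's implementation)."""
    cache_control = headers.get("Cache-Control", "").lower()
    if "private" in cache_control:
        return "private-cache" if private_cache_enabled else "not-cached"
    return "global-cache"

# A private response with the robot-specific cache enabled:
print(cache_destination({"Cache-Control": "private"}, True))
# The same response with private HTTP caching disabled:
print(cache_destination({"Cache-Control": "private"}, False))
# A publicly cacheable response goes to the cache shared by all robots:
print(cache_destination({"Cache-Control": "max-age=3600"}, True))
```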
- Private HTTP Cache Size:
- This property specifies the maximum amount of memory to use for the private HTTP cache. The size is
specified in kilobytes. Be careful about setting this number high, because each
and every robot instance running could potentially use this amount of memory in addition to its
other state. All pages stored in the HTTP cache are compressed, so simple pages with text content will
require very little memory. Also note that only pages marked with
Cache-Control: private or
similar will ever be stored in the private HTTP cache. Pages marked for non-private caching will go
into the global HTTP cache shared by all robots.
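The worst case implied above is easy to estimate: every running robot instance could fill its own private cache. This is illustrative arithmetic only; the function and the example numbers are hypothetical, not Kapow Katalyst defaults.

```python
def worst_case_private_cache_mb(cache_size_kb, robot_instances):
    """Worst-case memory that private HTTP caches could add across all
    running robot instances, in megabytes (illustrative arithmetic,
    not a Kapow Katalyst API)."""
    return cache_size_kb * robot_instances / 1024

# e.g. a hypothetical 1024 KB private cache across 200 concurrent robot instances:
print(worst_case_private_cache_mb(1024, 200))  # 200.0 MB in the worst case
```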