Before creating robot processes, first briefly try the necessary manipulations on the target screen without worrying about logic, and verify what you can actually implement in BizRobo! Basic.
Anticipate the workload of investigating and troubleshooting problems (e.g., pages not displaying correctly in BizRobo! Basic, pages that cannot be manipulated). Do what you can during the first half of the project to bring potential issues and challenges to light, and start designing your robot only after that.
Keep the robot design as simple as possible, and build multiple simple robots instead of a single complex one.
Respect the "1 robot = 1 function" rule and avoid making your robots too complex or big because it will increase the maintenance costs and time (investigation and shooting) and have an impact on what makes a robot good, its response speed.
As described below, loop processes store session information in the robot on each iteration, increasing memory consumption. Even if you have not run into problems so far, keep this rule in mind to reduce memory consumption when running many robots at the same time.
Use a DB as much as possible when handling large amounts of data.
For example, when a robot reads voluminous Excel or CSV files, their contents are held in the heap memory of the robot's JVM. Heap memory has a limited capacity, and loading large files into it can trigger garbage collection (GC) runs or disk access, degrading the performance of the robot processes.
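BizRobo! Basic steps are configured visually, so the idea can only be sketched here in general-purpose code. The following Python sketch (the CSV contents, table name, and in-memory SQLite DB are all hypothetical stand-ins) contrasts streaming rows into a DB one at a time with building the whole dataset in memory:

```python
import csv
import io
import sqlite3

# Hypothetical CSV data standing in for a voluminous file read by a robot.
rows = "id,name\n1,alpha\n2,beta\n3,gamma\n"

conn = sqlite3.connect(":memory:")  # a real robot would target a server DB
conn.execute("CREATE TABLE items (id INTEGER, name TEXT)")

# Stream rows into the DB one by one instead of accumulating a giant
# in-memory list -- only one row occupies memory at any moment.
reader = csv.DictReader(io.StringIO(rows))
for row in reader:
    conn.execute("INSERT INTO items VALUES (?, ?)", (row["id"], row["name"]))
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
print(count)  # 3
```

The same pattern applies regardless of the DB engine: the file is consumed incrementally, so heap usage stays flat no matter how large the input grows.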
Get snapshots of Load Page steps immediately after they run.
By taking a snapshot of every page the robot reads between the start of operation and the time its behavior stabilizes (i.e., no error screens appeared during development), you will speed up error investigation and resolution. Determining why a screen appears only at specific times (e.g., late at night) or only under certain conditions tends to take time, but if you capture snapshots and have the robot read the snapshot file created when an error appeared, you can reproduce the error at an early stage.
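The snapshot-and-replay workflow can be sketched in plain Python (the `save_snapshot` helper and the sample HTML are hypothetical, not a BizRobo! API): save the page source under a timestamped name at capture time, then load the saved file instead of the live page during investigation.

```python
import datetime
import pathlib
import tempfile

def save_snapshot(page_html: str, snapshot_dir: pathlib.Path) -> pathlib.Path:
    """Save the page source under a timestamped name so the failing
    page can be replayed later without hitting the live site."""
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S_%f")
    path = snapshot_dir / f"snapshot_{stamp}.html"
    path.write_text(page_html, encoding="utf-8")
    return path

with tempfile.TemporaryDirectory() as d:
    snap = save_snapshot("<html><body>error page</body></html>", pathlib.Path(d))
    # During investigation, read the saved file instead of the live page.
    replayed = snap.read_text(encoding="utf-8")
    print("error page" in replayed)  # True
```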
Specify output values inside a single Type.
When you add output values, e.g., for additional functions, to a robot with an improper design or after the robot has started operating, you might be tempted to create new Types and increase the number of Attributes used as output values, so as not to affect the existing mechanism. Avoid this unless you have a clear and valid reason: it is neither always feasible nor good for maintainability. (The investigation workload for maintaining the robot, or for testing it after a fix, increases, and you risk overlooking something.)
Use Write Log steps efficiently so that you can monitor robot behavior through logs.
At locations where you expect an error to occur, or that you want to use as checkpoints within the process, use Write Log steps to save the content displayed on screen and the variable values to the logs. This will spare you the time of running replication tests when debugging in Design Studio.
You should also use Write Log steps at conditional branching points to save the branch execution information to the logs; this will help you investigate any malfunction after your robot has started operating.
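As a general-purpose illustration of the same idea (the `process_order` function and the approval threshold are hypothetical, not part of BizRobo!), a checkpoint log before the branch plus a log inside each branch records exactly which path executed and why:

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("robot")

def process_order(order_total: float) -> str:
    # Checkpoint: record the input the branch decision is based on,
    # the way a Write Log step would, so a later investigation can
    # replay the decision from the logs alone.
    log.info("checkpoint: order_total=%s", order_total)
    if order_total >= 10000:
        log.info("branch taken: high-value path")
        return "approval required"
    log.info("branch taken: normal path")
    return "auto-approved"

result = process_order(12000)
print(result)  # approval required
```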
Turn strings of characters that are used in multiple locations (e.g., file paths used in Write File steps) into variables; do not set the strings directly in each Action.
Copying and pasting the same string into multiple settings is common practice, but you should avoid it: when you later perform maintenance, it is easy to forget to change one of the settings or to make a mistake, and malfunctions become harder to spot.
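The benefit is the same as a named constant in ordinary code. In this Python sketch (the path and the two functions are hypothetical), editing the single variable updates every place that uses it, so no setting can be forgotten:

```python
# Define the shared string once; every step references the variable.
OUTPUT_PATH = "C:/robot/output/report.csv"  # hypothetical path

def write_report(path: str) -> str:
    return f"writing to {path}"

def archive_report(path: str) -> str:
    return f"archiving {path}"

# Changing OUTPUT_PATH updates both call sites at once -- nothing
# to hunt down and nothing to overlook during maintenance.
print(write_report(OUTPUT_PATH))
print(archive_report(OUTPUT_PATH))
```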
Select ON for the "Create Directories" option in Write File steps.
If you also want a directory to be created along with the file, you must set this option to ON. Doing so avoids unexpected side effects when creating a file inside an existing directory, prevents errors that would stop the file from being written, and, above all, keeps files from being written outside the existing folder.
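The option behaves like the "create parent directories if missing" idiom found in most languages. A Python sketch of the equivalent behavior (the directory layout is hypothetical):

```python
import pathlib
import tempfile

with tempfile.TemporaryDirectory() as base:
    target = pathlib.Path(base) / "out" / "2024" / "report.txt"
    # Equivalent of "Create Directories" ON: create any missing parent
    # directories instead of failing because they do not exist yet.
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text("done", encoding="utf-8")
    ok = target.exists()
print(ok)  # True
```

With the option OFF, the write would fail whenever the target directory had not been prepared in advance, which is exactly the class of error the guideline is meant to prevent.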
To save KCUs when registering data in a DB, avoid "Execute SQL" as much as possible and use "Store In Database" instead.
"Execute SQL" consumes 1,000 KCU points per step where "Store In Database" only consumes 1. To save KCU points, it is hence better to use the latter step especially for robots that consume large amounts of KCU for task such as web crawling.
Avoid making variables global as much as possible.
Do not make variables global casually; do so only when it is clearly necessary or serves a specific goal. Extensive use of global variables makes post-operation debugging difficult and can create unexpected malfunctions that are hard to trace back.
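The maintainability argument is the same one made against global state in ordinary programming. In this Python sketch (both functions are hypothetical), the explicit-parameter version makes every change visible at the call site and testable in isolation:

```python
# Fragile: a global variable mutated from several places.
counter = 0

def step_a():
    global counter
    counter += 1  # which step changed it last? hard to trace afterwards

# Preferred: pass state explicitly, so every change is visible at the
# call site and the function can be tested on its own.
def step_b(count: int) -> int:
    return count + 1

total = step_b(step_b(0))
print(total)  # 2
```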
Use snippets efficiently and make the processing of each page independent so that you can build and verify robots more effectively.
If the top page a robot processes changes depending on execution conditions, and the robot cannot move past it, you will be unable to open the new target page, which makes development and unit testing inefficient and prevents robot development from being shared among several people. In such cases, obtain the HTML of each page to be processed, feed it into a snippet created for each screen, and build a robot that links these snippets together at a point where they come together to a certain degree. You can then develop the robot for each screen unit simultaneously, in a shared and more efficient way.
Depending on the page, static HTML files alone may be insufficient, and session information about page transitions may also be needed. In such cases, build a robot flow that simply executes the screen transitions until it reaches the target page, and save the displayed page to the shared pool with "Save Session". By feeding those files into "Restore Session", the snippet-creating robots can achieve the same effect as with the HTML files above.
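The save/restore idea can be illustrated with generic serialization in Python (the session fields, file name, and cookie value are all hypothetical, not the actual "Save Session" file format): one robot persists the state reached after the transitions, and a snippet robot later reloads it and continues from there.

```python
import json
import pathlib
import tempfile

# Hypothetical session state captured after executing the screen
# transitions (a stand-in for what "Save Session" would persist).
session_state = {
    "cookies": {"JSESSIONID": "abc123"},
    "last_url": "https://example.com/step3",
}

with tempfile.TemporaryDirectory() as pool:
    # "Save Session" analogue: persist the state to a shared location.
    path = pathlib.Path(pool) / "session.json"
    path.write_text(json.dumps(session_state), encoding="utf-8")

    # "Restore Session" analogue: a snippet robot reloads the state and
    # can continue from the saved page without redoing the transitions.
    restored = json.loads(path.read_text(encoding="utf-8"))

print(restored["cookies"]["JSESSIONID"])  # abc123
```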