Launchers / Automated tests

SQL Compare Launcher (sql_compare.jar)

The SQL Compare launcher allows connecting to 1 or 2 different databases, optionally running a script remotely (e.g. to prepare some data or run a consolidation job) and comparing the data returned by the 2 SQL scripts.

Configuration

The sql_compare.xml file is just a template and must NOT be edited. It's used by the system to dynamically build the form that the user will be able to fill in from the GUI when creating a custom execution configuration.

Parameter Description
SQL
sql script root path This must indicate where all the SQL scripts are located.
This is a root path. Each test in XStudio has a canonical path that will be appended to this path.
This path must NOT include a trailing slash.

Default value is: C:/test_repository/tests/sql
Connection1
connection url This must indicate the connection URL to connect to the first database.

Default value is: jdbc:mysql://192.168.10.1/test_db
username This must indicate the username to connect to the database.

Default value is: my_username
password This must indicate the password to connect to the database.

Default value is: my_password
driver jar This must indicate the driver to use to connect to the database (depending on the database server this may be a MySQL, PostgreSQL, Oracle or any other JDBC driver).

Default value is: Y:/mysql-connector-java-3.1.12/mysql-connector-java-3.1.12-bin.jar
driver classpath This must indicate the fully qualified class name of the driver to instantiate.

Default value is: com.mysql.jdbc.Driver
write all values This must indicate whether you want to log in XStudio all the values read from each database (this can be a large amount of data).

Default value is: 1 (true)
Connection2
connection url This must indicate the connection URL to connect to the second database.

Default value is: jdbc:mysql://192.168.10.1/test_db
username This must indicate the username to connect to the database.

Default value is: my_username
password This must indicate the password to connect to the database.

Default value is: my_password
driver jar This must indicate the driver to use to connect to the database (depending on the database server this may be a MySQL, PostgreSQL, Oracle or any other JDBC driver).

Default value is: Y:/mysql-connector-java-3.1.12/mysql-connector-java-3.1.12-bin.jar
driver classpath This must indicate the fully qualified class name of the driver to instantiate.

Default value is: com.mysql.jdbc.Driver
write all values This must indicate whether you want to log in XStudio all the values read from each database (this can be a large amount of data).

Default value is: 1 (true)
Scripts
run command remotely before running test This must indicate whether you need to run a script remotely on a server (using plink) before running the test cases.

Default value is: 0 (false)
plink executable This must indicate the path to the plink executable.

Default value is: C:/progra~1/Putty/plink.exe -batch
remote host This must indicate the remote host on which to run the command.

Default value is: 10.1.1.1
username This must indicate the username to connect to the remote host.

Default value is: my_username
password This must indicate the password to connect to the remote host.

Default value is: my_password
command This must indicate the command to execute remotely.

Default value is: grep /~eric/ /var/log/httpd/access.log > fredlog

These values can be changed while creating the campaign session from XStudio.
Note about file path parameters:
Any parameter referring to a file or folder path (for instance Test root path) can be provided using either the \ separator (if the tests are going to be executed on a Windows agent) or the / separator (if the tests are going to be executed on a Linux or Mac OS X agent).

On Windows, if you provide a path containing an OS-localizable folder such as C:\Program Files, always prefer the English version (i.e. NOT C:\Programmes if you're using a French-localized Windows) or the corresponding native environment variable (i.e. %PROGRAMFILES%).


Process

The tests are executed by the launcher using the following procedure:

  • If necessary, connect to a remote host using plink with the credentials specified in the configuration and execute the command specified there as well
  • Establish 2 separate connections to the databases specified in the configuration
  • Execute the sql script <sqlScriptRootPath>/<testPath>/<testName>_<testcaseName>_a.sql on the first database
  • Execute the sql script <sqlScriptRootPath>/<testPath>/<testName>_<testcaseName>_b.sql on the second one
  • Compare both result sets
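
For example, assuming a sql script root path of C:/test_repository/tests/sql, a test named test1 whose canonical path is folder1 and a test case named testcase1, the launcher would execute and compare the following pair of scripts (the file contents below are purely illustrative):

-- C:/test_repository/tests/sql/folder1/test1_testcase1_a.sql (executed on Connection1)
SELECT * FROM some_table;

-- C:/test_repository/tests/sql/folder1/test1_testcase1_b.sql (executed on Connection2)
SELECT * FROM some_table;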

The test will be marked as passed or failed depending on the comparison of the 2 result sets: if the result sets are exactly identical, a success is logged. The logs and the execution trace of the scripts are also attached to the test case execution in XStudio.
If parameters are set on your test cases, the launcher searches the queries for each parameter name and replaces any occurrence with the parameter's value. This allows you to customize query templates based on the parameterization of each test case.
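
As a hypothetical illustration (the parameter name and table below are not defined by the launcher), if a test case defines a parameter named ARTICLE_COLOR with the value Red, a query containing that name would be rewritten before execution:

-- query template stored in the script
SELECT * FROM table_stock_1 WHERE article_color = 'ARTICLE_COLOR';

-- query actually executed after substitution
SELECT * FROM table_stock_1 WHERE article_color = 'Red';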

Tutorial: Creating and executing SQL Compare tests

In this tutorial, we will learn to run some SQL Compare tests.

Prerequisites

Create a MySQL database and run the following script to populate it with example data:
CREATE TABLE table_stock_1 (
  article_id INTEGER,
  article_name TEXT,
  article_color TEXT,
  article_nb INTEGER,
  PRIMARY KEY(article_id)
);

INSERT INTO table_stock_1 VALUES
(1, 'Hat', 'Red', 10),
(2, 'Hat', 'Blue', 4),
(3, 'Hat', 'Green', 17),
(4, 'Sweat-shirt', 'Red', 5);

CREATE TABLE table_stock_2 (
  article_id INTEGER,
  article_name TEXT,
  article_color TEXT,
  article_nb INTEGER,
  PRIMARY KEY(article_id)
);

INSERT INTO table_stock_2 VALUES
(1, 'Hat', 'Red', 10),
(2, 'Hat', 'Blue', 4),
(3, 'Hat', 'Green', 17),
(4, 'Sweat-shirt', 'Red', 5);

CREATE TABLE table_stock_3 (
  article_id INTEGER,
  article_name TEXT,
  article_color TEXT,
  article_nb INTEGER,
  PRIMARY KEY(article_id)
);

INSERT INTO table_stock_3 VALUES
(1, 'Hat', 'Red', 10),
(2, 'Hat', 'Blue', 3),
(4, 'Sweat-shirt', 'Red', 6);
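
Optionally, check that the three tables are populated as expected before creating the test scripts:

SELECT COUNT(*) FROM table_stock_1; -- expected: 4
SELECT COUNT(*) FROM table_stock_2; -- expected: 4
SELECT COUNT(*) FROM table_stock_3; -- expected: 3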


Then create 2 test case scripts:
Test case 1:
  • Create the text file C:\test_repository\tests\sql_compare\folder1\test1_testcase1_a.sql and write in it:
    SELECT * FROM table_stock_1;
  • Create the text file C:\test_repository\tests\sql_compare\folder1\test1_testcase1_b.sql and write in it:
    SELECT * FROM table_stock_2;

Test case 2:
  • Create the text file C:\test_repository\tests\sql_compare\folder1\test1_testcase2_a.sql and write in it:
    SELECT * FROM table_stock_1;
  • Create the text file C:\test_repository\tests\sql_compare\folder1\test1_testcase2_b.sql and write in it:
    SELECT * FROM table_stock_3;
The test case "testcase1" of "test1" is going to succeed as it compares the result of a simple SELECT * on two identical tables (table_stock_1 and table_stock_2), while the second test case is going to fail because table_stock_3 differs from table_stock_1.
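
To see the difference that makes testcase2 fail, you can compare the two tables manually in any MySQL client, for example:

-- rows of table_stock_1 that have no identical counterpart in table_stock_3
SELECT * FROM table_stock_1
WHERE (article_id, article_name, article_color, article_nb) NOT IN
      (SELECT article_id, article_name, article_color, article_nb FROM table_stock_3);
-- returns the rows with article_id 2, 3 and 4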

Create a dedicated category for SQL Compare tests and create a test

  • create a category SQL Compare associated with the launcher sql_compare.jar
  • under this category, create (somewhere in the tree) a test with name test1 and with a canonical path set to folder1.
  • create 2 test cases under this test named testcase1 and testcase2 (this is what is used to find the right scripts on disk).

Creating a test campaign

  • create a campaign including only the test test1
  • create a campaign session specifying in the configuration:
    • sql scripts root path: C:/test_repository/tests/sql_compare
    • the same driver, classpath and credentials to connect to the test database (connections 1 and 2)

Run a campaign session

Run the campaign session

Permissions

WARNING: if you're running your tests on Windows, it may be required to run the tests as administrator.
Having an account with Administrators permissions may not even be enough in some cases (especially if you're using Windows 10) and you may need to completely disable UAC (User Account Control) on your computer.
To do so:
  • Press the Windows + R key combination
  • Type in regedit
  • Go to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System
  • In the right-side pane, look for EnableLUA and set its value to 0
  • Close the registry editor
  • Restart your computer

Debug

If your tests are not executed correctly or are reporting only failures, it is very likely because your configuration is incorrect or because you used a wrong naming convention for your tests and test cases.

The best way to quickly find out what's wrong is to look at the traces generated by XStudio (or XAgent).
The traces always include a detailed description of what the launcher performs (command line execution, script execution, API calls, etc.) to run a test case. So, if you experience problems, the first thing to do is to activate the traces and look at what's happening when you run your tests.

Then, try to execute the exact same commands manually in a cmd box.
This will normally fail the same way.
At this point, you need to figure out what has to be changed in these commands in order to have them run properly.

When you have something working, compare these commands to what's described in the Process chapter above. This will tell you exactly what you need to change.

Most of the time, this is related to:
  • incorrect values in some parameters of your configuration,
  • the name of your tests,
  • the name of your test cases,
  • the canonical path of your tests.