The SQLines Data tool allows you to validate the migrated data in tables. Two options are available:
A row count check is a quick and simple way to verify that all rows of all tables were migrated. The SQLines Data tool executes a SELECT COUNT(*) query for every table and compares the results.
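The idea behind the row count check can be sketched in a few lines of Python. This is only an illustrative sketch, not the SQLines Data implementation: two SQLite in-memory databases stand in for the source and target, and the table names and data are invented for the demo.

```python
# Illustrative sketch of a row-count check (NOT the SQLines Data implementation).
# Two SQLite in-memory databases stand in for the source and target databases.
import sqlite3

def row_counts(conn, tables):
    """Run SELECT COUNT(*) for every table and return {table: count}."""
    return {t: conn.execute(f'SELECT COUNT(*) FROM {t}').fetchone()[0]
            for t in tables}

def compare_row_counts(source, target, tables):
    """Return the tables whose source and target row counts differ."""
    src, tgt = row_counts(source, tables), row_counts(target, tables)
    return {t: (src[t], tgt[t]) for t in tables if src[t] != tgt[t]}

# Demo data: emp matches, dept is missing one row in the target.
source = sqlite3.connect(':memory:')
target = sqlite3.connect(':memory:')
for conn in (source, target):
    conn.execute('CREATE TABLE emp (empno INTEGER)')
    conn.execute('CREATE TABLE dept (deptno INTEGER)')
source.executemany('INSERT INTO emp VALUES (?)', [(i,) for i in range(14)])
target.executemany('INSERT INTO emp VALUES (?)', [(i,) for i in range(14)])
source.executemany('INSERT INTO dept VALUES (?)', [(10,), (20,), (30,), (40,)])
target.executemany('INSERT INTO dept VALUES (?)', [(10,), (20,), (30,)])

print(compare_row_counts(source, target, ['emp', 'dept']))
# {'dept': (4, 3)}
```

Because only one COUNT(*) query runs per table, this kind of check scales well even for large tables, but it cannot detect rows whose column values were corrupted during migration; that is what the full data comparison below is for.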
To run the validation by row count, go to the Validation tab, select the Row Count option, and click Validate:
To run the validation by row count in the command line, use the -cmd=validate and -vopt=rowcount options.
For example:
sqlines -sd=ora,scott/tiger@xe -td=pg,user/pwd@host,db -t=*.* -cmd=validate -vopt=rowcount
Sample output:
Connecting to databases (328 ms)
   Oracle Database 21c Express Edition Release 21.0.0.0.0 - Production (Ok, 328 ms)
   PostgreSQL 17rc1 on x86_64-windows, compiled by msvc-19.41.34120, 64-bit (Ok, 140 ms)

Reading the database schema (6 tables, 6.5 sec)

Validating table row count (4 concurrent sessions):

  SCOTT.DEPT - Started (1 of 6, session 1)
  SCOTT.DEPT - Ok (15 ms, session 1)
     Source: 4 rows (15 ms)
     Target: 4 rows (15 ms)
  SCOTT.EMP - Started (2 of 6, session 1)
  SCOTT.EMP - Ok (16 ms, session 1)
     Source: 14 rows (16 ms)
     Target: 14 rows (16 ms)
  SCOTT.BONUS - Started (3 of 6, session 1)
  SCOTT.BONUS - Ok (0 ms, session 1)
     Source: 0 rows (0 ms)
     Target: 0 rows (0 ms)
  SCOTT.SALGRADE - Started (4 of 6, session 1)
  SCOTT.SALGRADE - Ok (0 ms, session 1)
     Source: 5 rows (0 ms)
     Target: 5 rows (0 ms)
  SCOTT.DUMMY - Started (5 of 6, session 1)
  SCOTT.DUMMY - Ok (0 ms, session 1)
     Source: 1 row (0 ms)
     Target: 1 row (0 ms)
  SCOTT.N1 - Started (6 of 6, session 1)
  SCOTT.N1 - Ok (0 ms, session 1)
     Source: 1 row (0 ms)
     Target: 1 row (0 ms)

There are no row count differences.

Summary:
  Total number of tables: 6
     With the same row count: 6
     With the different row count: 0
  Total number of rows: 25 in source, 25 in target
     Row count difference: 0

Validation time: 297 ms
For more details, see the Command Line reference.
For the full data comparison, the SQLines Data tool reads all rows and columns from the corresponding source and target tables, ordered by the primary or unique key columns (or, if no such key exists, by all numeric, string, and datetime columns), and compares the results.
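The comparison described above can be sketched in Python. This is only an illustrative sketch, not the SQLines Data implementation: both sides read their rows in the same deterministic order and are compared pairwise, with SQLite in-memory databases and invented demo data standing in for the real source and target.

```python
# Illustrative sketch of a full data comparison (NOT the SQLines Data code).
# Both tables are read ordered by the same columns, then compared row by row.
import sqlite3
from itertools import zip_longest

def compare_table_data(source, target, src_table, tgt_table, columns):
    """Read both tables ordered by the given columns; report the first mismatch."""
    cols = ', '.join(columns)
    src_rows = source.execute(f'SELECT {cols} FROM {src_table} ORDER BY {cols}')
    tgt_rows = target.execute(f'SELECT {cols} FROM {tgt_table} ORDER BY {cols}')
    # zip_longest also catches the case where one table has extra rows.
    for i, (s, t) in enumerate(zip_longest(src_rows, tgt_rows), start=1):
        if s != t:
            return f'row {i}: source={s}, target={t}'
    return 'equal'

# Demo data: one department name was garbled during the "migration".
source = sqlite3.connect(':memory:')
target = sqlite3.connect(':memory:')
for conn in (source, target):
    conn.execute('CREATE TABLE dept (deptno INTEGER, dname TEXT)')
source.executemany('INSERT INTO dept VALUES (?, ?)',
                   [(10, 'ACCOUNTING'), (20, 'RESEARCH')])
target.executemany('INSERT INTO dept VALUES (?, ?)',
                   [(20, 'RESEARCH'), (10, 'ACCOUNTG')])

print(compare_table_data(source, target, 'dept', 'dept', ['deptno', 'dname']))
# row 1: source=(10, 'ACCOUNTING'), target=(10, 'ACCOUNTG')
```

The shared ORDER BY is what makes the pairwise comparison meaningful: without a deterministic, identical ordering on both sides, equal tables could still produce rows in different sequences. Note that in practice, cross-database string collation differences can affect this ordering.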
To run the full data validation, go to the Validation tab, select the Data option, and click Validate:
To run the full data validation in the command line, use the -cmd=validate and -vopt=rows options.
For example:
sqlines -sd=ora,scott/tiger@xe -td=pg,user/pwd@host,db -t=*.* -cmd=validate -vopt=rows
Sample output:
Connecting to databases (219 ms)
   Oracle Database 21c Express Edition Release 21.0.0.0.0 - Production (Ok, 219 ms)
   PostgreSQL 17rc1 on x86_64-windows, compiled by msvc-19.41.34120, 64-bit (Ok, 94 ms)

Reading the database schema (6 tables, 750 ms)

Validating table data (4 concurrent sessions):

  SCOTT.DEPT - Started (1 of 6, session 1)
  SCOTT.DEPT - Ok (0 ms, session 1)
     Source: 4 rows (0 ms read, 0 rows/sec, 68 bytes, 0 bytes/sec)
     Target: 4 rows (0 ms read, 0 rows/sec, 68 bytes, 0 bytes/sec)
     Source query: SELECT "DEPTNO", "DNAME", "LOC" FROM SCOTT.DEPT ORDER BY DEPTNO, DNAME, LOC
     Target query: SELECT DEPTNO, DNAME, LOC FROM DEPT ORDER BY DEPTNO, DNAME, LOC
  SCOTT.EMP - Started (2 of 6, session 1)
  SCOTT.EMP - Ok (0 ms, session 1)
     Source: 14 rows (0 ms read, 0 rows/sec, 465 bytes, 0 bytes/sec)
     Target: 14 rows (0 ms read, 0 rows/sec, 687 bytes, 0 bytes/sec)
     Source query: SELECT "EMPNO", "ENAME", "JOB", "MGR", "HIREDATE", "SAL", "COMM", "DEPTNO" FROM SCOTT.EMP ORDER BY EMPNO, ENAME, JOB, MGR, HIREDATE, SAL, COMM, DEPTNO
     Target query: SELECT EMPNO, ENAME, JOB, MGR, HIREDATE, SAL, COMM, DEPTNO FROM EMP ORDER BY EMPNO, ENAME, JOB, MGR, HIREDATE, SAL, COMM, DEPTNO
  SCOTT.BONUS - Started (3 of 6, session 1)
  SCOTT.BONUS - Ok (0 ms, session 1)
     Source: 0 rows (0 ms read)
     Target: 0 rows (0 ms read)
     Source query: SELECT "ENAME", "JOB", "SAL", "COMM" FROM SCOTT.BONUS ORDER BY ENAME, JOB, SAL, COMM
     Target query: SELECT ENAME, JOB, SAL, COMM FROM BONUS ORDER BY ENAME, JOB, SAL, COMM
  SCOTT.SALGRADE - Started (4 of 6, session 1)
  SCOTT.SALGRADE - Ok (0 ms, session 1)
     Source: 5 rows (0 ms read, 0 rows/sec, 44 bytes, 0 bytes/sec)
     Target: 5 rows (0 ms read, 0 rows/sec, 44 bytes, 0 bytes/sec)
     Source query: SELECT "GRADE", "LOSAL", "HISAL" FROM SCOTT.SALGRADE ORDER BY GRADE, LOSAL, HISAL
     Target query: SELECT GRADE, LOSAL, HISAL FROM SALGRADE ORDER BY GRADE, LOSAL, HISAL
  SCOTT.DUMMY - Started (5 of 6, session 1)
  SCOTT.DUMMY - Ok (16 ms, session 1)
     Source: 1 row (16 ms read, 63 rows/sec, 1 bytes, 63 bytes/sec)
     Target: 1 row (16 ms read, 63 rows/sec, 1 bytes, 63 bytes/sec)
     Source query: SELECT "DUMMY" FROM SCOTT.DUMMY ORDER BY DUMMY
     Target query: SELECT DUMMY FROM DUMMY ORDER BY DUMMY
  SCOTT.N1 - Started (6 of 6, session 1)
  SCOTT.N1 - Ok (0 ms, session 1)
     Source: 1 row (0 ms read, 0 rows/sec, 1 bytes, 0 bytes/sec)
     Target: 1 row (0 ms read, 0 rows/sec, 4 bytes, 0 bytes/sec)
     Source query: SELECT "C1" FROM SCOTT.N1 ORDER BY C1
     Target query: SELECT C1 FROM N1 ORDER BY C1

Summary:
  Tables: 6 (6 compared, 0 failed)
     Equal tables: 6
     Different tables: 0

Validation time: 1.3 sec
For more details, see the Command Line reference.
If the full data validation reports differences, you can create dump files containing the differing data found during validation. Edit the sqldata.cfg file and specify -trace_diff_data=yes (the option can also be specified on the command line).
The SQLines Data tool generates a <table>.diff.txt file for every table that has data validation issues.
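As a sketch, a sqldata.cfg fragment enabling this tracing might look like the following. Only the -trace_diff_data option is taken from this page; the connection and table options are copied from the command-line examples above, on the assumption that the configuration file uses the same option=value syntax as the command line.

```
-sd=ora,scott/tiger@xe
-td=pg,user/pwd@host,db
-t=*.*
-cmd=validate
-vopt=rows
-trace_diff_data=yes
```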