SQLines Data is a high-performance data transfer, schema conversion and migration validation tool that supports major enterprise databases:
SQLines Data is written in C/C++ and uses native low-level in-memory bulk loader APIs to transfer data. It is available for Linux and Windows.
The tool provides a powerful command line interface with many options to help you automate and customize data transfer and transformation tasks.
You can use SQLines Data for cross-platform database migration. The tool migrates table definitions, constraints and indexes, and transfers data. Use the SQLines SQL Converter tool to convert SQL scripts, including stored procedures and functions.
Migration Features:
You can use SQLines Data in your ETL processes as a data extract-load tool.
ETL Features:
Often you need to change the database schema names during the migration; see the -smap option for how to do this.
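As a sketch, a schema mapping could be added to sqldata.cfg like this (the schema names are placeholders, and the source:target value format for -smap is an assumption for illustration):

```shell
# Append a schema name mapping to the configuration file.
# HR (source schema) -> hr_prod (target schema); names are examples only.
cat >> sqldata.cfg <<'EOF'
-smap=HR:hr_prod
EOF

# The same option can be passed on the command line (not run here):
#   sqldata -smap=HR:hr_prod ...
```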
Often you also need to change table names during the migration; you can do this by modifying the sqlines_tmap.txt file supplied with the tool.
Every line of this file contains a single table name mapping:
source_name, target_name
source_name2, target_name2
…
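For instance, a sqlines_tmap.txt that renames two hypothetical tables could look like this (written here with a shell heredoc; the table names are examples, not defaults shipped with the tool):

```shell
# Each line maps one fully qualified source table to its target name.
cat > sqlines_tmap.txt <<'EOF'
SALES.CONTACTS, sales.customer_contacts
SALES.ORDERS, sales.orders_archive
EOF
```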
Use fully qualified table names to specify source tables; for Oracle and IBM DB2 they must be in upper case.
If you need to change column names and their data types, you can use the sqlines_cmap.txt file supplied with the tool to define the column mapping.
Every line of this file contains a single column name mapping as follows:
schema.table, column, target_column [, target_datatype]
The target data type is optional; if you do not specify it, only the column name is mapped and the default data type conversion is used.
For example:
SALES.CONTACTS, DESCRIPTION, DESC, VARCHAR2(4000)
Use fully qualified table names to specify source tables; for Oracle and IBM DB2 they must be in upper case: SCHEMA.TABLE
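Putting this together, a sqlines_cmap.txt could look like this (the first line repeats the example above; the second maps a hypothetical column by name only, keeping the default data type conversion):

```shell
# schema.table, column, target_column [, target_datatype]
cat > sqlines_cmap.txt <<'EOF'
SALES.CONTACTS, DESCRIPTION, DESC, VARCHAR2(4000)
SALES.CONTACTS, PHONE_NUM, PHONE
EOF
```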
Sometimes you need to change the default data type mapping provided by the SQLines Data tool. You can use the sqlines_dtmap.txt file to define your own mapping.
For example, by default SQLines Data converts Oracle VARCHAR2(4000) to VARCHAR(4000) in MariaDB. You can change the mapping to TEXT by defining:
VARCHAR2(4000), TEXT
in the sqlines_dtmap.txt file. For more details, see the -dtmapf configuration option.
This configuration file defines the global data type mapping, i.e. it is applied to all tables.
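For example, the VARCHAR2-to-TEXT mapping above can be put in place like this (assuming -dtmapf takes the mapping file name as its value):

```shell
# Global data type mapping: VARCHAR2(4000) columns become TEXT in the target.
cat > sqlines_dtmap.txt <<'EOF'
VARCHAR2(4000), TEXT
EOF

# Point the tool at the mapping file (not run here):
#   sqldata -dtmapf=sqlines_dtmap.txt ...
```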
Sometimes you do not need to move constraints and indexes to the target database, for example, when you move data for reporting purposes, or when you want to migrate constraints and indexes later.
Edit the sqldata.cfg configuration file and set -constraints=no and -indexes=no.
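A minimal sketch of the change to sqldata.cfg:

```shell
# Transfer data only; skip constraint and index migration.
cat >> sqldata.cfg <<'EOF'
-constraints=no
-indexes=no
EOF
```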
Use sqlines_tsel.txt and the -tself option to define which columns to transfer, change the column order, transform column values, and add new computed columns based on specified expressions.
For example:
-- Syntax:
-- source_table, colexpr1, ...; (use ; to terminate the list, you can use newlines and tabs in expression items)
-- For Oracle, DB2 specify fully qualified name in uppercase: SCHEMA.TABLE

SALES.CONTACTS, NAME, SUBSTR(CREATED_DT, 1, 10) AS CREATED_DT, 'HQ OFFICE' AS OFFICE;
SALES.ORDERS, *, 'NA REGION' AS REGION; -- adds a new column with computed value
If nothing is specified for a table, all its columns are transferred as they are defined in the source database.
Edit the sqlines_twhere.txt file and specify an individual filter for any source table. If a filter is not specified for a table, all its data is transferred.
For example:
-- Syntax:
-- source_table1, condition1; (do not specify the WHERE keyword, use ; to terminate the WHERE clause)
-- source_table2, condition2;
-- ...

dbo.Documents, [create_dt] BETWEEN CAST('2024-07-17' AS DATE) AND CAST('2024-07-18' AS DATE);
For more details, see -twheref option.
Edit the sqlines_tsel.txt file and specify column expressions (using the SQL dialect of the source database), add or remove columns, define computed columns, and so on.
During the database migration, some tables may fail to transfer. SQLines Data generates the sqldata_failed_tables.txt file with the list of failed tables. Using the -tf=sqldata_failed_tables.txt command line option, you can re-run the transfer for the failed tables only.
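As a sketch, the re-run could look like this (the connection options are placeholders, and the one-table-per-line file layout is an assumption for illustration):

```shell
# Simulate a failed-tables file; SQLines Data generates this automatically.
cat > sqldata_failed_tables.txt <<'EOF'
SALES.ORDERS
EOF

# Re-run the transfer for the listed tables only (not run here):
#   sqldata -sd=oracle,scott/tiger@orcl -td=mariadb,root/pwd@localhost \
#           -tf=sqldata_failed_tables.txt
```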
In addition to specifying tables, you can also specify ad-hoc queries to define which data to transfer. Use the -qf option to define your queries.
When importing data into MySQL, MariaDB or PostgreSQL, you can add the command line option -trace_data=yes to generate dump files, i.e. the data that was sent to the MySQL and MariaDB LOAD DATA INFILE and PostgreSQL COPY commands. This can help debug data transfer issues.
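A minimal sketch of enabling this in sqldata.cfg:

```shell
# Write the data sent to LOAD DATA INFILE / COPY into dump files for debugging.
cat >> sqldata.cfg <<'EOF'
-trace_data=yes
EOF
```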
If you need to transfer DDL (tables, constraints, indexes, etc.) without transferring data, set -data=no on the command line or in the sqldata.cfg configuration file.
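For a DDL-only run, the configuration change is a one-liner:

```shell
# Migrate table definitions, constraints and indexes, but transfer no rows.
cat >> sqldata.cfg <<'EOF'
-data=no
EOF
```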
You can validate the data transfer using the SQLines Data tool; please check Data Validation.
If you have any questions, please contact us.