DB2
https://www.ibm.com/analytics/us/en/db2/trials/
It is also quite easy to install a private Docker version of DB2.
https://github.com/stanislawbartkowski/docker-db2
or
https://github.com/stanislawbartkowski/javahotel/tree/dockerdb2
There is also an officially supported IBM method of running DB2 in Docker.
https://www.ibm.com/cloud/blog/announcements/ibm-db2-developer-community-edition
http://www-01.ibm.com/support/docview.wss?uid=swg21385217
Prepare a separate database and user authorized to run a test.
db2 create database perfdb
Catalog the remote node and database. The example assumes host thinkde, port 50000, and database perfdb.
db2 catalog tcpip node thinkde remote thinkde server 50000
db2 catalog database perfdb at node thinkde
On the node where DB2 is installed create a user perf. The user should have LOAD authority on the database.
db2 "GRANT LOAD ON DATABASE TO USER PERF"
The standard db2inst1 user can be used as well; in that case it is not necessary to grant any additional privileges.
Verify connection
db2 connect to perfdb user perf
Enter current password for perf:
Database Connection Information
Database server = DB2/LINUXX8664 11.1.0
SQL authorization ID = PERF
Local database alias = PERFDB
db2 "create table x (x int)"
DB20000I The SQL command completed successfully.
db2 "drop table x"
DB20000I The SQL command completed successfully.
https://github.com/stanislawbartkowski/mytpcds/blob/master/env.templates/db2.rc
Parameter | Description | Default value |
---|---|---|
DBNAME | Cataloged database name | perfdb |
DBUSER | Database username | perf |
DBPASSWORD | Database user password | secret |
CLIENT | Qualifier for the LOAD command. If set to client, data is loaded from the client host, not the server | client |
DBTYPE | Database type | db2 |
DBURL | JDBC URL string | "jdbc:db2://thinkde:50000/$DBNAME" or "jdbc:db2://thinkde:50000/$DBNAME:currentSchema=$SCHEMA;" Important: it is case-sensitive, use currentSchema. |
JAVADRIVER | JDBC driver jar file path | /home/db2inst1/sqllib/java/db2jcc4.jar |
SCHEMA | Optional; if defined, data is loaded into this schema | PERFDB |
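A minimal db2.rc sketch using the values from the table above (the host name thinkde and the password are placeholders, not real values):

```shell
# Sketch of a db2.rc configuration using the table values above.
# Host "thinkde" and the password are placeholders.
DBNAME=perfdb
DBUSER=perf
DBPASSWORD=secret
CLIENT=client
DBTYPE=db2
SCHEMA=PERFDB
# Note: currentSchema is case-sensitive in the JDBC URL.
DBURL="jdbc:db2://thinkde:50000/$DBNAME:currentSchema=$SCHEMA;"
JAVADRIVER=/home/db2inst1/sqllib/java/db2jcc4.jar
```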
SQL0964C The transaction log for the database is full. SQLSTATE=57011
If this error appears during the load, increase the number of primary and secondary log files:
db2 update db cfg for perfdb using LOGPRIMARY 40
db2 update db cfg for perfdb using LOGSECOND 40
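LOGPRIMARY and LOGSECOND set the number of log files; the total log space also depends on LOGFILSIZ. A quick back-of-the-envelope check (LOGFILSIZ=4096 is an assumed value here, verify yours with db2 get db cfg):

```shell
# Estimate total transaction log space after the change.
# LOGFILSIZ is expressed in 4 KB pages; 4096 is an assumption, not your actual value.
LOGPRIMARY=40
LOGSECOND=40
LOGFILSIZ=4096
TOTAL_MB=$(( (LOGPRIMARY + LOGSECOND) * LOGFILSIZ * 4 / 1024 ))
echo "Approximate total log space: ${TOTAL_MB} MB"
```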
Prepare db2.tpl file
define __LIMITA = "";
define __LIMITB = "";
define __LIMITC = " fetch first %d rows only";
define _BEGIN = "-- start query " + [_QUERY] + " in stream " + [_STREAM] + " using template " + [_TEMPLATE];
define _END = "-- end query " + [_QUERY] + " in stream " + [_STREAM] + " using template " + [_TEMPLATE];
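The _BEGIN and _END defines above make the query generator wrap every generated query in marker comments. A small shell mirror of that string composition (the query, stream, and template values are illustrative):

```shell
# Mirror of the _BEGIN/_END defines: the current query number, stream number,
# and template name are substituted into the comment markers.
QUERY=1; STREAM=0; TEMPLATE=query1.tpl
BEGIN="-- start query ${QUERY} in stream ${STREAM} using template ${TEMPLATE}"
END="-- end query ${QUERY} in stream ${STREAM} using template ${TEMPLATE}"
echo "$BEGIN"
echo "$END"
```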
Generate qualification queries, https://github.com/stanislawbartkowski/mytpcds#qualify-queries
- Uncomment the ./tpc.sh queryqualification line
- ./run.sh
/home/sb/v2.13.0rc1//query_templates/query97.tpl replace DMS => 1200
/tmp/tmp.6FtMpVCq8r
/home/sb/v2.13.0rc1//query_templates/query98.tpl replace YEAR => 1999
/home/sb/v2.13.0rc1//query_templates/query98.tpl replace SDATE => 1999-02-22
/home/sb/v2.13.0rc1//query_templates/query98.tpl replace CATEGORY.1 => Sports
/home/sb/v2.13.0rc1//query_templates/query98.tpl replace CATEGORY.2 => Books
/home/sb/v2.13.0rc1//query_templates/query98.tpl replace CATEGORY.3 => Home
/tmp/tmp.6FtMpVCq8r
/home/sb/v2.13.0rc1//query_templates/query99.tpl replace DMS => 1200
/tmp/tmp.6FtMpVCq8r
qgen2 Query Generator (Version 2.13.0)
Copyright Transaction Processing Performance Council (TPC) 2001 - 2020
Parsed 99 templates
/tmp/tmp.6FtMpVCq8r
PASSED
./dsqgen -VERBOSE Y -DIRECTORY ../query_templates -INPUT ../query_templates/templates.lst -OUTPUT_DIR ../work/db2queries -DIALECT db2 -STREAMS 4 -sc 100
QUALIFY | RESULT |
---|---|
SQL coverage | 99 queries |
Qualify test | MATCHES 91 DIFFER 8 |
Linux pipes can be used when disk space is in short supply and there is no room in the db2u container or on the host for TPC/DS CSV files.
Assume the following topology
- NFS storage having TPC/DS text delimited CSV files
- TPC/DS client desktop. It should have access to the TPC/DS files to verify the number of records after the load. For instance: /mnt/usb1/repo/tpcdata/data100
- Host with the DB2u podman/docker version installed. Identify a local scratch directory accessible both externally and from inside the DB2u instance. For instance, external access: /var/db2ce/backup/scratch, internal: /database/backup/scratch/
- Access to TPC/DS data on NFS from DB2u: /mnt/ausb1/repo/tpcdata/data100
Copy the mytpc/proc/gopipe.sh script file to the DB2u host machine and configure it.
Parameter | Description | Sample value |
---|---|---|
INDIR | Path to the TPC/DS CSV files | /mnt/ausb1/repo/tpcdata/data100 |
OUTDIR | External access to DB2u scratch directory | /var/db2ce/backup/scratch |
vi gopipe.sh
INDIR=/mnt/ausb1/repo/tpcdata/data100
OUTDIR=/var/db2ce/backup/scratch
....
In the env/db2cli.rc file, set an additional parameter. It causes data to be loaded from the LOADSERVERDIR directory on the DB2u server.
LOADSERVERDIR=/database/backup/scratch
There are several supporting procedures in the gopipe.sh script file.
- remove : removes all pipe entries in the OUTDIR directory
- createpipe : creates pipe entries in the OUTDIR directory
- catpipetable : cats a single table as a test
- catpipe : cats all CSV tables
remove
createpipe
catpipetable customer
remove
createpipe
catpipe
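The idea behind createpipe/catpipe can be sketched with plain shell: a FIFO with the same name as a CSV file lets DB2 LOAD consume data that never lands on the server disk. A toy illustration, not the real gopipe.sh code; all paths and the row content are stand-ins:

```shell
# Toy illustration of the named-pipe load idea (not the real gopipe.sh).
INDIR=$(mktemp -d)    # stand-in for the NFS directory with CSV files
OUTDIR=$(mktemp -d)   # stand-in for the DB2u scratch directory
echo "1|John|" > "$INDIR/customer.csv"   # tiny stand-in CSV row

mkfifo "$OUTDIR/customer.csv"            # pipe entry, named like the CSV
# Feed the pipe in the background; in the real setup DB2 LOAD reads it.
cat "$INDIR/customer.csv" > "$OUTDIR/customer.csv" &
read -r line < "$OUTDIR/customer.csv"    # a reader standing in for LOAD
echo "$line"
```

Because the pipe is consumed as it is written, only a single row needs to exist in memory at any time, which is why no scratch space proportional to the CSV size is required.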