
BTEQ

After completing this module, you will be able to:


Use .EXPORT to SELECT data from the Teradata database
to another computer.
State the purpose of the four types of BTEQ EXPORT.
Use .IMPORT to process input from a host-resident data
file.
Use Indicator Variables to preserve NULLs.
Describe multiple sessions, and how they make parallel
access possible.
BTEQ
Batch-mode utility for submitting SQL requests to the Teradata database.
Runs on every supported platform, from laptop to mainframe.
Flexible and easy-to-use report writer.
Exports data to a client system from the Teradata database:
As displayable characters suitable for reports, or
In native host format, suitable for other applications.
Reads input data and imports it to the Teradata database as INSERTs,
UPDATEs or DELETEs.
Limited ability to branch forward to a LABEL, based on a return code or an
activity count.
BTEQ does error reporting, not error capture.
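As a minimal sketch of the batch usage described above (the logon string and table name are illustrative), a BTEQ job is simply a script of dot commands and SQL statements redirected into bteq:

/* hello.btq -- minimal BTEQ batch job; assumes an Accounts table exists */
.LOGON tdp1/user1,passwd1
SELECT COUNT(*) FROM Accounts;
.QUIT

To execute:  bteq < hello.btq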
Using BTEQ Conditional Logic
The Bank offers a number of special services to its Million-Dollar customers.
DELETE FROM Million_Dollar_Customer ALL;
.IF ERRORCODE = 0 THEN .GOTO TableOK
CREATE TABLE Million_Dollar_Customer
(Account_Number INTEGER
,Customer_Last_Name VARCHAR(20)
,Customer_First_Name VARCHAR(15)
,Balance_Current DECIMAL(9,2));
.LABEL TableOK
INSERT INTO Million_Dollar_Customer
SELECT A.Account_Number, C.Last_Name, C.First_Name, A.Balance_Current
FROM Accounts A INNER JOIN
Account_Customer AC INNER JOIN
Customer C
ON C.Customer_Number = AC.Customer_Number
ON A.Account_Number = AC.Account_Number
WHERE A.Balance_Current GT 1000000;
.IF ACTIVITYCOUNT > 0 THEN .GOTO Continue
.QUIT
.LABEL Continue
DELETE all rows from the Million_Dollar_Customer table.
IF the DELETE succeeds (ERRORCODE = 0), skip ahead to the label TableOK; IF it fails (non-zero, for example because the table does not exist), create the table first.
In either case, attempt to populate the table with INSERT/SELECT.
IF some rows are inserted (ACTIVITYCOUNT > 0), THEN continue and arrange the services; ELSE terminate the job.
BTEQ Error Handling
.SET ERRORLEVEL 2168 SEVERITY 4,
(2173, 3342, 5262) SEVERITY 8
.SET ERRORLEVEL UNKNOWN SEVERITY 16
SELECT . . . . . . . . . . . . .
FROM . . . . . . . . . . . . . ;
.IF ERRORLEVEL >= 14 THEN .QUIT 17 ;
You can assign an error level (SEVERITY) for each error code returned and make
decisions based on the level you assign.
Capabilities:
Customize the mapping from error code to ERRORLEVEL (severity).
Test the resulting ERRORLEVEL with .IF logic, or let BTEQ terminate automatically.
.SET MAXERROR <integer> defines the ERRORLEVEL threshold at which BTEQ terminates the job.
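Putting these commands together, a hedged sketch of a complete error-handling script might look like the following (error code 3807, "object does not exist", and the severities, return codes, and table name are illustrative):

.LOGON tdp1/user1,passwd1

/* Map specific error codes to severities (values shown are illustrative). */
.SET ERRORLEVEL 3807 SEVERITY 4
.SET ERRORLEVEL UNKNOWN SEVERITY 16

/* Terminate automatically if any statement reaches severity 8 or higher. */
.SET MAXERROR 8

SELECT COUNT(*) FROM Million_Dollar_Customer;

/* Branch on the severity assigned to the last request. */
.IF ERRORLEVEL >= 4 THEN .GOTO TableMissing
.QUIT 0
.LABEL TableMissing
.QUIT 4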
BTEQ EXPORT

Note: For a channel-attached host, the output file is specified as DDNAME instead of FILE.

BTEQ Script (export1.btq):

.LOGON tdp1/user1,passwd1
.EXPORT DATA FILE=/home2/user1/datafile1
SELECT Account_Number
FROM Accounts
WHERE Balance_Current LT 100;
.EXPORT RESET
.QUIT

To execute the script:  bteq < export1.btq

Default output (console):

Logon complete
1200 Rows returned
Time was 15.25 seconds

Data file of Account Numbers (datafile1):

12348009
19450824
23498763
23748091
85673542
19530824
92234590
:
BTEQ .EXPORT

.EXPORT { DATA | INDICDATA | REPORT | DIF [DATALABELS] | RESET }
        { FILE | DDNAME } = filename
        [, LIMIT = n]
        [, OPEN | CLOSE]
        [AXSMOD modname 'init_string']
.EXPORT DATA        Sends results to a host file in record mode.
.EXPORT INDICDATA   Sends query results that contain indicator variables to a host file, allowing
                    host programs to deal with nulls.
.EXPORT REPORT      Sends results to a host file in field mode. The data set contains column
                    headings and formatted data. Data is truncated if a line exceeds 254 characters.
.EXPORT DIF         Converts output to Data Interchange Format, used to transport data to various
                    PC programs, such as Lotus 1-2-3.
.EXPORT RESET       Reverses the effect of a previous .EXPORT and closes the output file.
LIMIT n             Sets a limit on the number of rows captured.
OPEN/CLOSE          Specifies whether the output data set or file is kept open or closed during a RETRY.
AXSMOD              Names the access module used, for example, to export to tape.
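For example, a hedged sketch of a DIF export for a PC spreadsheet (the file name is illustrative, and option ordering may vary slightly by BTEQ release):

.LOGON tdp1/user1,passwd1
/* DATALABELS includes the column headings in the DIF file. */
.EXPORT DIF DATALABELS FILE=/home2/user1/balances.dif
SELECT Account_Number, Balance_Current
FROM   Accounts
WHERE  Balance_Current < 0;
.EXPORT RESET
.QUIT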
BTEQ .EXPORT Script Example
.LOGON tdp1/user1,passwd1
.EXPORT DATA FILE=/home2/user1/datafile2, LIMIT=100
SELECT Account_Number
FROM Accounts
WHERE Balance_Current < 500 ;
.EXPORT RESET
.QUIT
export2.btq
To execute this script in a UNIX environment:
$ bteq < export2.btq | tee export2.out
*** Success, Stmt# 1 ActivityCount = 330
*** Query completed. 330 rows found. 1 column returned.
*** Total elapsed time was 1 second.
*** Warning: RetLimit exceeded.
Ignoring the rest of the output.
*** Output returned to console
$
datafile2 contains 100 account numbers.
export2.out contains the output, i.e., the informational text sent to the console.
BTEQ Data Modes

Field mode is set by: .EXPORT REPORT
Transfers data one column at a time, with numeric data converted to character. Example output:

Column A   Column B   Column C
   1          2          3
   4          5          6
   7          8          9

Record mode is set by: .EXPORT DATA
Transfers data one row at a time in host format. Nulls are represented as zeros or spaces.

f1 f2 f3
f1 f2 f3
f1 f2 f3

Indicator mode is set by: .EXPORT INDICDATA
Transfers data one row at a time in host format, sending an indicator variable for nulls. The data
fields for nulls still contain zeros or spaces; the indicator bits identify them.

Indic. Byte(s) f1 f2 f3
Indic. Byte(s) f1 f2 f3
Indic. Byte(s) f1 f2 f3
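The three modes can be compared with a sketch that exports the same SELECT three times, changing only the .EXPORT command (file names are illustrative):

.LOGON tdp1/user1,passwd1

/* Field mode: column headings and character-formatted data (report file). */
.EXPORT REPORT FILE=/home2/user1/accts.rpt
SELECT Account_Number, Balance_Current FROM Accounts;
.EXPORT RESET

/* Record mode: one binary record per row in host format. */
.EXPORT DATA FILE=/home2/user1/accts.dat
SELECT Account_Number, Balance_Current FROM Accounts;
.EXPORT RESET

/* Indicator mode: record mode plus leading NULL-indicator bytes. */
.EXPORT INDICDATA FILE=/home2/user1/accts.ind
SELECT Account_Number, Balance_Current FROM Accounts;
.EXPORT RESET

.QUIT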
Indicator Variables
Indicator variables allow utilities to process records that contain NULL indicators.
INDICATORS causes the leading n bytes of each record to be treated as NULL indicators instead of data.

NULL Columns example (Field 3 is null, Field 5 is null):

00101000 00000000   F1 F2 F3 F4 F5 F6 ... F12

The indicator option is specified as .EXPORT INDICDATA and .[SET] INDICDATA [ON] in BTEQ, and as
INDICATORS (or INDICATORS ON) in FastLoad, MultiLoad, FastExport, and TPump.
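A hedged round-trip sketch of NULL preservation in BTEQ (the Customer_Copy table and file name are hypothetical): the export writes indicator bytes in front of each record, and the import reads them back so NULL values stay NULL instead of becoming zeros.

.LOGON tdp1/user1,passwd1

/* Export with indicator bytes. */
.EXPORT INDICDATA FILE=/home2/user1/cust.ind
SELECT Customer_Number, Social_Security FROM Customer;
.EXPORT RESET

/* Import the same file; the indicator bytes identify the NULL columns. */
.IMPORT INDICDATA FILE=/home2/user1/cust.ind
.REPEAT *
USING in_cust (INTEGER)
     ,in_ssn  (INTEGER)
INSERT INTO Customer_Copy VALUES (:in_cust, :in_ssn);
.QUIT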
Determining the Logical Record Length with
Fixed Length Columns
[Hex dump: a 32-byte fixed-length record for customer "Jones, Jack", byte positions 1-28 shown across
the Customer #, Last Name, First Name, Social Security, and Birth Date fields.]

                                         Length
CREATE TABLE Customer, FALLBACK
  (Customer_Number    INTEGER               4
  ,Last_Name          CHAR(8)               8
  ,First_Name         CHAR(8)               8
  ,Social_Security    INTEGER               4
  ,Birth_Date         DATE                  4
  ,OD_Limit           DECIMAL(7,2))         4
UNIQUE PRIMARY INDEX (Customer_Number);
                                  Total    32
Determining the Logical Record Length with
Variable Length Columns
                                         Length
CREATE TABLE Customer, FALLBACK
  (Customer_Number    INTEGER               4
  ,Last_Name          VARCHAR(8)           10
  ,First_Name         VARCHAR(8)           10
  ,Social_Security    INTEGER               4
  ,Birth_Date         DATE                  4
  ,OD_Limit           DECIMAL(7,2))         4
UNIQUE PRIMARY INDEX (Customer_Number);
                                  Total    36

[Hex dump: the 29-byte record actually produced for customer "Jones, Jack", byte positions 1-29 shown
across the Customer #, Last Name, First Name, Social Security, Birth Date, and OD Limit fields.]

Redefining Last_Name and First_Name as VARCHAR(8) saves the 7 trailing spaces stored for this row
(3 in Last_Name, 4 in First_Name), but adds a 2-byte length field in front of each of the two columns.
Determining the Logical Record Length with
.EXPORT INDICDATA
                                         Length
CREATE TABLE Customer, FALLBACK
  (Customer_Number    INTEGER               4
  ,Last_Name          VARCHAR(8)           10
  ,First_Name         VARCHAR(8)           10
  ,Social_Security    INTEGER               4
  ,Birth_Date         DATE                  4
  ,OD_Limit           DECIMAL(7,2))         4
UNIQUE PRIMARY INDEX (Customer_Number);
                                  Total    37

All 6 columns are nullable, which adds 6 indicator bits (rounded up to 1 byte) when using .EXPORT in
INDICDATA mode. Therefore, the maximum logical record length equals 37. Assume in this example that
Social Security is NULL.

[Hex dump: the record for "Jones, Jack" with a leading indicator byte whose fourth bit is set,
flagging Social_Security (the fourth column) as NULL; byte positions 1-27 shown across the indicator
byte and the Customer #, Last Name, First Name, Social Security, and Birth Date fields.]
.IMPORT
(for Channel-Attached Systems)
IMPORT loads data from the host to the Teradata database with a USING clause.
INDICDATA preserves nulls.
.IMPORT { DATA | INDICDATA }  { DDNAME | FILE } = name  [, SKIP = n]
.IMPORT DATA Reads a host file in record mode.
.IMPORT INDICDATA Reads data in host format using indicator
variables in record mode to identify nulls.
DDNAME Name of MVS JCL DD statement or CMS
FILEDEF.
FILE Name of input data set in all other environments.
SKIP = n Number of initial records from the data stream
that should be skipped before the first row is
transmitted.
.IMPORT
(for Network-Attached Systems)
.IMPORT { DATA | INDICDATA | REPORT | VARTEXT ['c'] }
        { FILE | DDNAME } = filename
        [, SKIP = n]
        [AXSMOD modname 'init_string']
DATA imports data from the server to Teradata with a USING clause.
INDICDATA import records contain NULL bits.
REPORT imports Teradata report data. Data expected in BTEQ EXPORT REPORT
format.
VARTEXT reads each record as variable-length character fields. The default field delimiter is |; an
alternate delimiter can be specified within single quotes.
AXSMOD names the access module used, for example, to import from tape.
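Because no VARTEXT example appears in this module, here is a hedged sketch (the file name, delimiter, and SKIP value are illustrative). With VARTEXT, each USING field is defined as VARCHAR; SKIP=1 discards a header record:

.LOGON tdp1/user1,passwd1
/* Comma-delimited text records; skip the header line. */
.IMPORT VARTEXT ',' FILE=/home2/user1/custfile.csv, SKIP=1
.QUIET ON
.REPEAT *
USING in_CustNo (VARCHAR(11))
     ,in_SocSec (VARCHAR(11))
UPDATE Customer
SET   Social_Security = :in_SocSec
WHERE Customer_Number = :in_CustNo;
.QUIT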
BTEQ IMPORT
(Data Load from the Host)
.LOGON tdp1/user1,passwd1
.IMPORT DATA DDNAME = datain3 ;
.QUIET ON
.REPEAT *
USING in_CustNo (INTEGER)
, in_SocSec (INTEGER)
, Filler (CHAR(30))
, in_Lname (CHAR(20))
, in_Fname (CHAR(10))
INSERT INTO Customer
( Customer_Number
, Last_Name
, First_Name
, Social_Security )
VALUES ( :in_CustNo
, :in_Lname
, :in_Fname
, :in_SocSec)
;
.QUIT
.QUIET ON   Limits output to reporting only errors and request processing statistics.
.REPEAT *   Causes BTEQ to read records until EOF.
USING       Defines the input data coming from the host.
BTEQ IMPORT
(from a UNIX Environment)
bteq
Enter your BTEQ Command:
.RUN FILE = jobscript.btq
or
bteq < jobscript.btq | tee jobscript.out
jobscript.btq
.LOGON tdp1/user1,passwd1
.IMPORT DATA FILE = /home2/user1/datafile4
.QUIET ON
.REPEAT *
USING in_CustNo (INTEGER)
, in_SocSec (INTEGER)
UPDATE Customer
SET Social_Security = :in_SocSec
WHERE Customer_Number = :in_CustNo
;
.QUIT;
.QUIET ON   Limits output to reporting only errors and request processing statistics.
.REPEAT *   Causes BTEQ to read records until EOF.
USING       Defines the input data coming from the UNIX server.
BTEQ IMPORT
(from a PC-Connected LAN)
.LOGON tdp1/user1,passwd1
.IMPORT DATA FILE = E:\datafile5
.QUIET ON
.REPEAT *
USING in_CustNo (INTEGER)
,in_SocSec (INTEGER)
UPDATE Customer
SET Social_Security = :in_SocSec
WHERE Customer_Number = :in_CustNo ;
.QUIT
import1.btq
From Teradata Command Prompt on PC:
c:\ bteq
Teradata BTEQ 08.02.00.00 for WIN32.
Enter your logon or BTEQ command:
.RUN FILE = c:\td_scripts\import1.btq
Multiple Sessions
Session:
Logical connection between host and Teradata database.
Workstream composed of a series of requests between the host and the
database.
Multiple sessions:
Allow tasks to be worked on in parallel.
Require row hash locking for parallel processing: UPI, NUPI, USI
transactions.
Too few degrade performance.
Too many will not improve performance.
Initializing a single session typically takes 1 to 2 seconds.
.SET SESSIONS
.SET SESSIONS 8
.LOGON tdp1/user1,passwd1
.IMPORT DATA DDNAME=datain6
.QUIET ON
.REPEAT *
USING in_CustNo (INTEGER)
, in_SocSec (INTEGER)
, Filler (CHAR(30))
, in_LName (CHAR(20))
, in_FName (CHAR(10))
INSERT INTO Customer
( Customer_Number
, Last_Name
, First_Name
, Social_Security )
VALUES ( :in_CustNo
,:in_LName
,:in_FName
,:in_SocSec )
;
.QUIT
Parallel Processing Using Multiple Sessions to Access
Individual Rows
Although a single row can reside
on 1 AMP, it might require a full
table scan to locate it. All AMPs
are required to participate.
[Diagram: a single transaction (TXN 2) occupies all four AMPs/VDisks, while TXN 1 and TXN 3 wait.]
If the location of the row is known, only the applicable AMP needs to be involved. Other AMPs can
work on other tasks, which is where multiple sessions are useful.
[Diagram: twelve single-AMP transactions (TXN 1 - TXN 12) spread across four AMPs/VDisks, three per AMP.]
Multiple transactions execute in parallel, provided that:
Each transaction uses fewer than all AMPs.
Enough are sent to keep ALL AMPs busy.
Each parallel transaction has a unique internal ID.
When Do Multiple Sessions Make Sense?
[Table diagram: TRANS_HISTORY (Account_Number, Trans_Number, Trans_Date, Trans_ID, Amount) with
PK, FK/NN, NUPI, USI, and NUSI designations.]
Multiple sessions improve performance ONLY for SQL requests that impact fewer than
ALL AMPs.
Which of the following batch requests would benefit from multiple sessions?
1. INSERT INTO Trans_History
VALUES (:T_Nbr, DATE, :Acct_Nbr, :T_ID, :Amt);
2. SELECT * FROM Trans_History
WHERE Trans_Number=:Trans_Number;
3. DELETE FROM Trans_History
WHERE Trans_Date < DATE - 120;
4. DELETE FROM Trans_History
WHERE Account_Number= :Account_Number;
Trans Type    Table or Row Lock    Multiple Sessions Useful or Not?
When Do Multiple Sessions Make Sense?
[Table diagram: TRANS_HISTORY (Account_Number, Trans_Number, Trans_Date, Trans_ID, Amount) with
PK, FK/NN, NUPI, USI, and NUSI designations.]
Multiple sessions improve performance ONLY for SQL requests that impact fewer than
ALL AMPs.
Which of the following batch requests would benefit from multiple sessions?
1. INSERT INTO Trans_History
VALUES (:T_Nbr, DATE, :Acct_Nbr, :T_ID, :Amt);
2. SELECT * FROM Trans_History
WHERE Trans_Number=:Trans_Number;
3. DELETE FROM Trans_History
WHERE Trans_Date < DATE - 120;
4. DELETE FROM Trans_History
WHERE Account_Number= :Account_Number;
Trans Type    Table or Row Lock    Multiple Sessions Useful or Not?
1. NUPI       Row Hash             Yes
2. NUSI       Full Table           No
3. FTS        Full Table           No
4. NUPI       Row Hash             Yes
Application Utility Checklist

Feature                    BTEQ    FastLoad    FastExport    MultiLoad    TPump
DDL Functions              ALL
DML Functions              ALL
Multiple DML               Yes
Multiple Tables            Yes
Multiple Sessions          Yes
Protocol Used              SQL
Conditional Expressions    Yes
Arithmetic Calculations    Yes
Data Conversion            Yes
Error Files                No
Error Limits               No
User-written Routines      No
Review Questions
Answer True or False.
1. True or False. With BTEQ you can import data from the host to Teradata AND export from Teradata to
the host.
2. True or False. .EXPORT DATA sends results to a host file in field mode.
3. True or False. INDICDATA is used to preserve nulls.
4. True or False. With BTEQ, you can use conditional logic to bypass statements based on a test of an
error code.
5. True or False. It is useful to employ multiple sessions when ALL AMPS will be used for the
transaction.
6. True or False. With .EXPORT, you can have output converted to a format that can be used with PC
programs.
Review Question Answers
1. True. With BTEQ you can import data from the host to Teradata AND export from Teradata to the host.
2. False. .EXPORT DATA sends results to a host file in record mode, not field mode.
3. True. INDICDATA is used to preserve nulls.
4. True. With BTEQ, you can use conditional logic to bypass statements based on a test of an error code.
5. False. It is useful to employ multiple sessions when FEWER than all AMPs are used for the transaction.
6. True. With .EXPORT, you can have output converted to a format that can be used with PC programs.
Lab Exercises
Lab Exercise 2-1
Purpose
In this lab, you will use BTEQ to perform imports with different numbers of sessions. You will move
selected rows from the AU.Accounts table to your personal Accounts table and from a data file to your
table. You will repeat tasks using different numbers of sessions.
What you need
Populated AU.Accounts table and your empty Accounts table
Tasks
1. INSERT/SELECT all rows from the populated AU.Accounts table to your own userid.Accounts table.
Note the timing and verify that you have the correct number of rows.
Time: Number of rows:
2. Export 1000 rows to a data file (data2_1).
3. Delete all rows from your userid.Accounts table.
4. Import the rows from your data set (data2_1) to your empty userid.Accounts table. Note the time
and verify the number of rows.
Time: Number of rows:
5. Delete all the rows from your userid.Accounts table again.
Lab Exercises (cont.)
Lab Exercise 2-1 (cont.)
Tasks
6. Specify 8 sessions and import the rows from your data set to your empty userid.Accounts table.
Note the time and verify the number of rows.
Time: Number of rows:
7. Delete all the rows from your userid.Accounts table again.
8. Specify 210 sessions and import the rows from your data set to your empty userid.Accounts table.
Note the timing and verify the number of rows.
Time: Number of rows:
9. What are your conclusions based on the tasks you have just performed?
_______________________________________________________________________________________
_______________________________________________________________________________________
Lab Exercises (cont.)
Lab Exercise 2-2
Purpose
In this exercise, you will use BTEQ to select 500 rows from the AU.Customer table, representing a
specific set of 500 customers. First, you will use .EXPORT DATA to build a data set that contains 500
customer numbers; you will then use this as input to access the Customer table and use .EXPORT REPORT
to generate a report file.
What you need
Populated AU.Customer table
Tasks
1. From the AU.Customer table, export to a data file (data2_2a) the 500 customer numbers for the
customers that have the highest Social Security numbers. (Hint: You will need to use descending
order for Social Security numbers.)
2. Using the 500 customer numbers (in data2_2a) to select the 500 appropriate rows from AU.Customer,
export a report file named data2_2b. In your report you will need the fields: Customer_Number,
Last_Name, First_Name, Social_Security.
Hint: You will .IMPORT DATA from data2_2a and use .EXPORT REPORT to data2_2b.
3. View your report. The completed report should look like this:
Customer_Number Last_Name First_Name Social_Security
2001 Smith John 123456789
4. What are highest and lowest Social Security numbers in your report?
Highest: ________________________ Lowest: ________________________
Lab Solutions for Lab 2-1
Lab Exercise 2-1
cat lab216.btq
.SESSIONS 8
.LOGON u4455/tljc30,tljc30
.IMPORT DATA FILE = data2_1
.QUIET ON
.REPEAT *
USING in_account_number (INTEGER),
in_number (INTEGER),
in_street (CHAR(25)),
in_city (CHAR(20)),
in_state (CHAR(2)),
in_zip_code (INTEGER),
in_balance_forward (DECIMAL(10,2)),
in_balance_current (DECIMAL(10,2))
INSERT INTO Accounts
VALUES
(:in_account_number , :in_number , :in_street , :in_city , :in_state ,
:in_zip_code , :in_balance_forward, :in_balance_current);
.QUIT
bteq < lab216.btq
Lab Solutions for Lab 2-2
Lab Exercise 2-2
cat lab222.btq
.LOGON u4455/tljc20,tljc20
.IMPORT DATA FILE = data2_2a
.EXPORT REPORT FILE = data2_2b
.WIDTH 80
.REPEAT *
USING in_customer_number (INTEGER)
SELECT customer_number, last_name (CHAR(10)), first_name (CHAR(10)), social_security
FROM AU.Customer
WHERE customer_number = :in_customer_number;
.EXPORT RESET
.QUIT
more data2_2b
CUSTOMER_NUMBER LAST_NAME FIRST_NAME SOCIAL_SECURITY
-------------------------------- ----------------- ------------------- ----------------------------
9000 Underwood Anne 213633756
: : : :
CUSTOMER_NUMBER LAST_NAME FIRST_NAME SOCIAL_SECURITY
-------------------------------- ----------------- ------------------- ----------------------------
8501 Atchison Jose 213631261
