SQL*Loader (sqlldr) is Oracle's bulk loader utility for high-performance loading of data from external files into database tables.
Recently I had the chance to use SQL*Loader to automate uploading data into Oracle from text files in which each field is delimited with the pipe character "|".
1. Create a control file. You can name it anything; in my case it is add.ctl
load data infile '-'
APPEND
into table records
fields terminated by '|'
(
field1,
field2,
.
.
fieldN
)
load data infile '-' : In my case there are multiple text files to be processed by sqlldr, so instead of a file name I used '-', which tells sqlldr to read the data from standard input. APPEND adds the new rows to the table without truncating the existing data.
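Before wiring the loader into a script, it helps to sanity-check that every row of a feed has the number of fields the control file lists. A minimal sketch; the sample file path and its contents are made up for illustration:

```shell
# Create a small pipe-delimited sample (hypothetical data)
printf 'a|b|c\nd|e|f\n' > /tmp/sample.txt

# Print the distinct per-row field counts; a clean feed prints a single number
awk -F'|' '{ print NF }' /tmp/sample.txt | sort -u
```

If more than one number comes back, some rows have missing or extra delimiters and sqlldr would reject them into the bad file.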
2. Script that calls the sqlldr utility to upload the data from the text files.
#!/bin/sh
# Author: Askar Ali Khan
# Purpose: To upload data from text files into Oracle Table - RECORDS
# records_upload

# Define Oracle environment variables
ORACLE_BASE=/oracle
ORACLE_HOME=$ORACLE_BASE/102
ORACLE_SID=SID
LD_LIBRARY_PATH=$ORACLE_HOME/lib
PATH=$PATH:$ORACLE_HOME/bin
export ORACLE_BASE ORACLE_HOME ORACLE_SID LD_LIBRARY_PATH PATH

SRCDIR="/data"

# Must be run as the oracle user (UID 501 on this host);
# use a POSIX test and id -u, since /bin/sh may not support [[ ]] or $UID
if [ "$(id -u)" -ne 501 ]; then
    echo "$0 must be run as oracle"
    exit 1
fi

cd "$SRCDIR" || exit 1

# If the directory contains *.txt files, feed each one to sqlldr; otherwise do nothing
for file in *.txt; do
    test -f "$file" || continue
    sqlldr oraclUser/Password control=add.ctl < "$file"
    rm -f "$file"
done
The script processes the files in the /data directory one by one, feeding each into sqlldr on standard input and then deleting it. Deletion is a must, because on the next run new files will have been fetched into the /data directory by another FTP script.
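One risk in the loop above is that a file is deleted even when the load fails, losing that data. A sketch of a safer pattern is below; since sqlldr is not available here, a stand-in loader function is used, and the paths and names are illustrative only:

```shell
#!/bin/sh
# Stand-in for the real load; in the actual script this line would be:
#   sqlldr oraclUser/Password control=add.ctl < "$1"
loader() { cat "$1" > /dev/null; }

mkdir -p /tmp/demo
printf 'a|b\n' > /tmp/demo/one.txt

for file in /tmp/demo/*.txt; do
    test -f "$file" || continue
    if loader "$file"; then
        rm -f "$file"                 # delete only after a successful load
    else
        mv "$file" "$file.failed"     # keep failed files for later inspection
    fi
done
```

Checking the exit status this way means a bad feed stays on disk as a .failed file instead of silently disappearing.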
3. Define a cron job under the oracle user to run the script every 15 minutes
$ crontab -e
*/15 * * * * /usr/local/bin/records_upload > /dev/null 2>&1
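With a 15-minute schedule, a slow load could still be running when the next cron invocation starts, and two sqlldr processes would then race over the same files. A minimal sketch of an overlap guard using flock; the lock file path is an assumption:

```shell
#!/bin/sh
# Serialize runs: open a lock file on descriptor 9 and try a non-blocking lock
exec 9> /tmp/records_upload.lock
if ! flock -n 9; then
    echo "previous run still in progress, exiting"
    exit 0
fi
echo "lock acquired"
# ... the upload loop from step 2 would go here ...
```

The lock is released automatically when the script exits, so a crashed run cannot wedge the schedule.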