Bulk Loads with the LOAD TABLE Statement

The LOAD TABLE statement efficiently imports data from a text or binary file into an existing database table.

The statement loads data into column indexes, whether created automatically or explicitly defined by users.

The permissions needed to execute a LOAD TABLE statement are set on the server command line with the -gl option. Sybase recommends the -gl all setting, which allows any user to load or unload a table; this is the default set by start_iq. With -gl all in effect, to use LOAD TABLE you must be the owner of the table, have DBA authority, or have ALTER permission on the table. You also need INSERT permission on the table.
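For example, a server might be started with load and unload permission enabled for all users. The server name and database file below are illustrative; consult your own startup parameters:

start_iq -n my_iq_server -gl all /work/mydb.db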

Using Command Files to Load Data

To load large amounts of data, most users create command files.

Transaction Processing and LOAD TABLE

When you issue the LOAD TABLE statement for an IQ table, a savepoint occurs automatically before the data is loaded.

If the load completes successfully, Sybase IQ releases the savepoint. If the load fails, the transaction rolls back to the savepoint. This approach gives you flexibility in committing transactions. For example, if you issue two LOAD TABLE commands, you can ensure that either both commands commit or neither commits.
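For example, the following sequence (table names, column lists, and file paths are illustrative) commits both loads as a unit. If either load fails, it rolls back to its savepoint, and you can issue ROLLBACK instead of COMMIT so that neither load takes effect:

LOAD TABLE Departments
( DepartmentID, DepartmentName, DepartmentHeadID )
FROM '/d1/MILL1/dept.txt'

LOAD TABLE Employees
( EmployeeID, DepartmentID, Surname )
FROM '/d1/MILL1/emp.txt'

COMMIT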

When you issue LOAD TABLE for a catalog store table, there is no automatic savepoint. If the load succeeds, it commits automatically. If the load fails, it rolls back. You cannot roll back a successful load of a catalog store table.

Load from a Flat File: UNIX Example

The following statement loads the data from the file dept.txt into all columns of the Departments table.

This example assumes that no explicit data conversion is needed, and that the width of input columns matches the width of columns in the Departments table. The flat file dept.txt must exist at the specified location.

LOAD TABLE Departments
( DepartmentID, DepartmentName, DepartmentHeadID )
FROM '/d1/MILL1/dept.txt'

File Specification Requirements for Loads

In the FROM clause, use filename-string to specify files, and separate multiple strings by commas.

The files are read one at a time and processed in left-to-right order, as specified in the FROM clause. Any SKIP or LIMIT value applies only once, at the beginning of the load, not to each file.
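For example, the following hypothetical statement reads two input files in order; the SKIP value skips the first 10 rows of the combined input once, at the start of the load, rather than 10 rows per file:

LOAD TABLE Departments
( DepartmentID, DepartmentName, DepartmentHeadID )
FROM '/d1/MILL1/dept1.txt', '/d1/MILL1/dept2.txt'
SKIP 10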

If a load cannot complete, for example due to insufficient memory, the entire load transaction rolls back.

The filename-string is passed to the server as a string. The string is therefore subject to the same formatting requirements as other SQL strings. In particular:

  • If a backslash (\) precedes the character n, x, or another backslash, it is treated as an escape character. For this reason, to indicate directory paths on Windows systems, you must represent the backslash by two backslashes whenever the next character is one of those listed. (It is always safe to double every backslash.)

    To load data from the file c:\newinput.dat into the employees table:
    LOAD TABLE employees
    FROM 'c:\\newinput.dat' ...
  • For server-side loading (LOAD TABLE ... USING FILE), the path name is relative to the database server, not to the client application. If the statement runs against a database server on another computer, the directory name refers to directories on the server machine, and the input file for the load must reside on the server machine.

  • For client-side data loading (LOAD TABLE ... USING CLIENT FILE), the path name must be relative to the client application. The directory name refers to directories on the client machine.
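The two cases differ only in the USING clause. In this sketch (table name, column list, and paths are illustrative), the first statement reads a file on the server machine, while the second reads a file on the client machine:

LOAD TABLE Departments
( DepartmentID, DepartmentName, DepartmentHeadID )
USING FILE '/serverdata/MILL1/dept.txt'

LOAD TABLE Departments
( DepartmentID, DepartmentName, DepartmentHeadID )
USING CLIENT FILE 'c:\\clientdata\\dept.txt'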

Loads that Specify Named Pipes

When you load from a named pipe (or FIFO) on Windows, the program writing to the pipe must close the pipe in a special way.

The pipe writer must call FlushFileBuffers( ) and then DisconnectNamedPipe( ). These calls raise a PIPE_NOT_CONNECTED error on the reader side, which notifies Sybase IQ that the pipe was shut down in an orderly manner rather than by an uncontrolled disconnect. If the writer omits these calls, Sybase IQ reports an exception from hos_io::Read( ). See the Microsoft documentation for details on these calls.
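A minimal sketch of the writer side on Windows follows. The pipe name, row data, and error handling are illustrative, and the code assumes the Windows SDK; only the shutdown order at the end reflects the requirement described above:

#include <windows.h>

/* Write data to a named pipe that Sybase IQ reads via LOAD TABLE,
   then shut the pipe down in the order IQ expects. */
int main(void)
{
    HANDLE pipe = CreateNamedPipeA(
        "\\\\.\\pipe\\iq_load",       /* illustrative pipe name */
        PIPE_ACCESS_OUTBOUND,
        PIPE_TYPE_BYTE | PIPE_WAIT,
        1, 4096, 4096, 0, NULL);
    if (pipe == INVALID_HANDLE_VALUE)
        return 1;

    ConnectNamedPipe(pipe, NULL);     /* wait for IQ to open the pipe */

    const char row[] = "100,Marketing,501\n";
    DWORD written;
    WriteFile(pipe, row, sizeof row - 1, &written, NULL);

    /* Orderly shutdown: flush, then disconnect, so IQ sees
       PIPE_NOT_CONNECTED instead of an uncontrolled disconnect. */
    FlushFileBuffers(pipe);
    DisconnectNamedPipe(pipe);
    CloseHandle(pipe);
    return 0;
}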

Related concepts
Inserting Data Directly from Adaptive Server Enterprise
Load Time Environment Adjustments
Tools for Investigating Lock Contention