Imports bulk data into a database table from an external file.
LOAD [ INTO ] TABLE [ owner.]table-name [ ( column-name, ... ) ] load-source [ load-option ... ] [ statistics-limitation-option ]
load-source : { FROM filename-expression | USING FILE filename-expression | USING CLIENT FILE client-filename-expression | USING VALUE value-expression | USING COLUMN column-expression }
filename-expression : string | variable
client-filename-expression : string | variable
value-expression : expression
column-expression : column-name FROM table-name ORDER BY column-list
load-option : BYTE ORDER MARK { ON | OFF } | CHECK CONSTRAINTS { ON | OFF } | { COMPRESSED | AUTO COMPRESSED | NOT COMPRESSED } | { ENCRYPTED KEY 'key' | NOT ENCRYPTED } | COMMENTS INTRODUCED BY comment-prefix | COMPUTES { ON | OFF } | DEFAULTS { ON | OFF } | DELIMITED BY string | ENCODING encoding | ESCAPE CHARACTER character | ESCAPES { ON | OFF } | FORMAT { TEXT | BCP | SHAPEFILE | XML row-xpath ( column-xpath, ... ) [ NAMESPACES namespace ] } | HEXADECIMAL { ON | OFF } | ORDER { ON | OFF } | PCTFREE percent-free-space | QUOTE string | QUOTES { ON | OFF } | ROW DELIMITED BY string | SKIP integer | STRIP { ON | OFF | LTRIM | RTRIM | BOTH } | WITH CHECKPOINT { ON | OFF } | WITH { FILE NAME | ROW | CONTENT } LOGGING
statistics-limitation-option : STATISTICS { ON [ ALL COLUMNS ] | ON KEY COLUMNS | ON ( column-list ) | OFF }
comment-prefix : string
encoding : string
column-name Use this clause to specify one or more columns to load data into. Any columns not present in the column list become NULL if DEFAULTS is OFF. If DEFAULTS is ON and the column has a default value, that value is used. If DEFAULTS is OFF and a non-nullable column is omitted from the column list, the database server attempts to convert the empty string to the column's type.
When a column list is specified, it lists the columns that are expected to exist in the file and the order in which they are expected to appear. Column names cannot be repeated. Columns whose names do not appear in the list are set to NULL/zero/empty or DEFAULT (depending on column nullability, data type, and the DEFAULTS setting). Columns that exist in the input file that are to be ignored by LOAD TABLE can be specified using filler() as a column name.
load-source Use this clause to specify the data source to load data from. There are several sources of data from which data can be loaded. The following list gives the supported load sources:
FROM clause Use this to specify a file. The filename-expression is passed to the database server as a string. The string is therefore subject to the same database formatting requirements as other SQL strings. In particular:
To indicate directory paths, the backslash character (\) must be represented by two backslashes. The statement to load data from the file c:\temp\input.dat into the Employees table is:
LOAD TABLE Employees FROM 'c:\\temp\\input.dat' ...
The path name is relative to the database server, not to the client application.
You can use UNC path names to load data from files on computers other than the database server.
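For example, a statement along the following lines loads from a UNC path; the server and share names are illustrative only. Note that each backslash in the path is doubled, as described above.
-- Illustrative UNC path visible to the database server computer
LOAD TABLE Employees
FROM '\\\\fileserver\\exports\\input.dat';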
USING FILE clause Use this clause to load data from a file. This is synonymous with specifying the FROM filename clause.
When the LOAD TABLE statement is used with the USING FILE clause, you can request progress messages.
You can also use the Progress connection property to determine how much of the statement has been executed.
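For example, the following sketch (with an illustrative path) is equivalent to the FROM form shown earlier:
LOAD TABLE Employees
USING FILE 'c:\\temp\\input.dat';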
USING CLIENT FILE clause Use this clause to load data from a file on a client computer. When the database server retrieves data from client-filename-expression, the data is not materialized in the server's memory, so the database server limit on the size of BLOB expressions does not apply to the file. Therefore, the client file can be of an arbitrary size.
File name logging is not allowed if the table is being loaded from a client file. If the logging type is not specified, then WITH CONTENT LOGGING is used.
When the LOAD TABLE statement is used with the USING CLIENT FILE clause, you can request progress messages.
You can also use the Progress connection property to determine how much of the statement has been executed.
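For example, the following sketch loads from a file that resides on the client computer rather than on the database server; the path is illustrative only:
LOAD TABLE Employees
USING CLIENT FILE 'c:\\temp\\input.dat';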
USING VALUE clause Use this clause to load data from any expression of CHAR, NCHAR, BINARY, or LONG BINARY type, or from a BLOB string. The following are examples of how this clause can be used:
The following syntax uses the xp_read_file system procedure to get the values to load from the target file:
... USING VALUE xp_read_file( 'filename' )...
The following syntax specifies the value directly, inserting two rows with values of 4 and 5, respectively:
... USING VALUE '4\n5'...
The following syntax uses the results of the READ_CLIENT_FILE function as the value:
... USING VALUE READ_CLIENT_FILE( client-filename-expression )
In this case, you can also specify USING CLIENT FILE client-filename-expression, since the two forms are semantically equivalent.
If the ENCODING clause is not specified in the LOAD TABLE statement, then encoding for value-expression is assumed to be in the database character set (db_charset) if value-expression is of type CHAR or BINARY, and NCHAR database character set (nchar_charset) if value-expression is of type NCHAR.
USING COLUMN clause Use this clause to load data from a single column in another table. This clause is used by the database server when it replays the transaction log during recovery by replaying the LOAD TABLE...WITH CONTENT LOGGING statements. Transaction log records for LOAD TABLE...WITH CONTENT LOGGING statements comprise chunks of the original input file. When the database server encounters these chunks in the transaction log during recovery, it loads the chunks into a temporary table and then loads all the data from the original load operation.
The following clauses are required in the USING COLUMN clause:
table-name The name of the base or temporary table that contains the column to load data from. When used by the database server during recovery from the transaction log, this is the table that holds the chunks of rows to be parsed and loaded.
column-name The name of the column in table-name that holds the chunks of rows to be loaded.
column-list One or more columns in the destination table used to sort the rows before loading the data. column-list must be a verifiable unique set of values, such as a primary key or a unique index on non-nullable columns included within the column list.
load-option clause There are several load options you can specify to control how data is loaded. The following list gives the supported load options:
BYTE ORDER MARK clause Use this clause to specify whether the server should search for and interpret a byte order mark (BOM) at the beginning of the data. By default, this option is ON. If BYTE ORDER MARK is OFF, the server does not search for a BOM.
If the ENCODING clause is specified:
If the ENCODING clause is not specified:
CHECK CONSTRAINTS clause Use this clause to control whether constraints are checked during loading. CHECK CONSTRAINTS is ON by default, but the Unload utility (dbunload) writes out LOAD TABLE statements with CHECK CONSTRAINTS set to OFF. Setting CHECK CONSTRAINTS to OFF disables check constraints, which can be useful, for example, during database rebuilding. If a table has check constraints that call user-defined functions that are not yet created, the rebuild fails unless CHECK CONSTRAINTS is set to OFF.
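For example, a rebuild script might disable constraint checking for the duration of the load; the table and path below are illustrative only:
LOAD TABLE Employees
FROM 'c:\\temp\\input.dat'
CHECK CONSTRAINTS OFF;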
COMMENTS INTRODUCED BY clause Use this clause to specify the string used in the data file to introduce a comment. When used, LOAD TABLE ignores any line that begins with the string comment-prefix. For example, in the following statement, lines in input.dat that start with // are ignored.
LOAD TABLE Employees FROM 'c:\\temp\\input.dat' COMMENTS INTRODUCED BY '//' ...
Comments are only allowed at the beginning of a new line.
If COMMENTS INTRODUCED BY is omitted, the data file must not contain comments because they are interpreted as data.
COMPRESSED clause Specify COMPRESSED if the data being loaded is compressed in the input file. The database server decompresses the data before loading it. If you specify COMPRESSED and the data is not compressed, the LOAD fails and returns an error.
Specify AUTO COMPRESSED to allow the database server determine whether the data in the input file is compressed. If so, the database server decompresses the data before loading it.
Specify NOT COMPRESSED to indicate that the data in the input file is not compressed. You can also specify NOT COMPRESSED if the data is compressed, but you don't want the database server to decompress it. In this case, the data remains compressed in the database. However, if a file is both encrypted and compressed, you cannot use NOT ENCRYPTED without also using NOT COMPRESSED.
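For example, the following sketch lets the database server detect whether the (illustrative) input file is compressed and decompress it only if necessary:
LOAD TABLE Employees
FROM 'c:\\temp\\employees.dat'
AUTO COMPRESSED;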
COMPUTES clause By default, this option is ON, which enables recalculation of computed columns. Setting COMPUTES to OFF disables computed column recalculations. COMPUTES OFF is useful, for example, if you are rebuilding a database, and a table has a computed column that calls a user-defined function that is not yet created. The rebuild would fail unless this option was set to OFF.
The Unload utility (dbunload) writes out LOAD TABLE statements with COMPUTES set to OFF.
DEFAULTS clause By default, DEFAULTS is set to OFF. If DEFAULTS is OFF, any column not present in the list of columns is assigned NULL. If DEFAULTS is set to OFF and a non-nullable column is omitted from the list, the database server attempts to convert the empty string to the column's type. If DEFAULTS is set to ON and the column has a default value, that value is used.
DELIMITED BY clause Use this clause to specify the column delimiter string. The default column delimiter string is a comma; however, it can be any string up to 255 bytes in length (for example, ... DELIMITED BY '###' ...). The delimiter you specify is a string and should be quoted. If you want to specify tab-delimited values, you could specify the hexadecimal escape sequence for the tab character (9), ... DELIMITED BY '\x09' ....
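For example, the following sketch loads a tab-delimited file; the table and path are illustrative only:
LOAD TABLE Employees
FROM 'c:\\temp\\input.dat'
DELIMITED BY '\x09';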
ENCODING clause Use this clause to specify the character encoding used for the data being loaded into the database. The ENCODING clause cannot be used with the BCP format.
If a translation error occurs during the load operation, it is reported based on the setting of the on_charset_conversion_failure option.
Specify the BYTE ORDER MARK clause to interpret a byte order mark in the data.
If the ENCODING clause is specified:
If the ENCODING clause is not specified:
ENCRYPTED clause Use this clause to specify encryption settings. When loading encrypted data, specify ENCRYPTED KEY followed by the key used to encrypt the data in the input file.
Specify NOT ENCRYPTED to indicate that the data in the input file is not encrypted. You can also specify NOT ENCRYPTED if the data is encrypted, but you don't want the database server to decrypt it. In this case, the data remains encrypted in the database. However, if a file is both encrypted and compressed, you cannot use NOT ENCRYPTED without also using NOT COMPRESSED.
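For example, the following sketch assumes the input file was written with the same key; the table, path, and key are illustrative only:
LOAD TABLE Employees
FROM 'c:\\temp\\employees.dat'
ENCRYPTED KEY 'MySampleKey';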
ESCAPE CHARACTER clause Use this clause to specify the escape character used in the data. The default escape character for characters stored as hexadecimal codes and symbols is a backslash (\), so \x0A is the linefeed character, for example. This can be changed using the ESCAPE CHARACTER clause. For example, to use the exclamation mark as the escape character, you would enter:
ESCAPE CHARACTER '!'
It is recommended that the string you specify for the escape character is no longer than one multibyte character.
ESCAPES clause Use this clause to control whether to recognize escape characters. With ESCAPES turned ON (the default), characters following the escape character (which defaults to \) are recognized and interpreted as special characters by the database server. Newline characters can be included as the combination \n, and other characters can be included in data as hexadecimal ASCII codes, such as \x09 for the tab character. A sequence of two backslash characters ( \\ ) is interpreted as a single backslash. A backslash followed by any character other than n, x, X, or \ is interpreted as two separate characters. For example, \q inserts a backslash and the letter q. It is recommended that the string you specify for the escape character is no longer than one multibyte character.
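For example, if the (illustrative) input file contains literal backslashes that should not be interpreted as escape sequences, escape processing can be turned off:
LOAD TABLE Employees
FROM 'c:\\temp\\input.dat'
ESCAPES OFF;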
FORMAT TEXT If you choose FORMAT TEXT (the default), input lines are assumed to be characters (as defined by the ENCODING option), one row per line, with values separated by the column delimiter string.
FORMAT BCP Specify FORMAT BCP to load Adaptive Server Enterprise-generated BCP out files.
FORMAT SHAPEFILE Specify FORMAT SHAPEFILE to load ESRI shapefiles. The shapefile must be on the database server computer and must be loaded using FROM filename-expression or USING FILE filename-expression, where filename-expression refers to an ESRI shapefile with the .shp file extension. The associated .shx and .dbf files must be located in the same directory as the .shp file, and have the same base file name.
For FORMAT SHAPEFILE, the encoding defaults to ISO-8859-1 if the ENCODING clause is not specified.
If you specify FORMAT SHAPEFILE, the following load options are allowed:
The LOAD TABLE statement gets the SRID from the second column type that you are loading into. For example, if you created the second column with type ST_Geometry(SRID=4326), then the geometries are loaded using SRID 4326. If your second column has type ST_Geometry (no explicit SRID), then the geometries are loaded using SRID 0.
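For example, the following is a minimal sketch with illustrative table and file names, assuming the shapefile supplies one attribute value for name and its geometry for location. Because the spatial column is the second column and is declared with SRID=4326, the geometries are loaded using SRID 4326 per the rule above. The cities.shx and cities.dbf files are assumed to be in the same directory as cities.shp on the database server computer.
-- Illustrative table; the second column is the spatial column
CREATE TABLE Cities (
    name VARCHAR(100),
    location ST_Geometry(SRID=4326)
);
LOAD TABLE Cities
FROM 'c:\\shapes\\cities.shp'
FORMAT SHAPEFILE;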
FORMAT XML If you specify FORMAT XML, the following load options are allowed:
If you use FORMAT XML, the input file is parsed in the same way as a query that uses the openxml system procedure. The arguments of the SQL statement correspond to the system procedure parameters as follows:
LOAD TABLE statement clause | openxml system procedure argument | Details |
---|---|---|
row-xpath | xpath | |
— | flags | There is no way to specify a value with FORMAT XML that corresponds to the flags argument of openxml. |
NAMESPACES | namespaces | |
The FORMAT XML clause uses the following parameters:
row-xpath A string or variable containing an XPath query. XPath allows you to specify patterns that describe the structure of the XML document you are querying. The XPath pattern included in this argument selects the nodes from the XML document. Each node that matches the XPath query in the row-xpath argument generates one row in the table.
Metaproperties can only be specified in FORMAT XML clause row-xpath arguments. A metaproperty is accessed within an XPath query as if it were an attribute. If namespace is not specified, then by default the prefix mp is bound to the Uniform Resource Identifier (URI) urn:ianywhere-com:sa-xpath-metaprop. If namespace is specified, this URI must be bound to mp or some other prefix to access metaproperties in the query. Metaproperty names are case sensitive. The following metaproperties are supported:
@mp:id returns an ID for a node that is unique within the XML document. The ID for a given node in a given document may change if the database server is restarted. The value of this metaproperty increases with document order.
@mp:localname returns the local part of the node name, or NULL if the node does not have a name.
@mp:prefix returns the prefix part of the node name, or NULL if the node does not have a name or if the name is not prefixed.
@mp:namespaceuri returns the URI of the namespace that the node belongs to, or NULL if the node is not in a namespace.
@mp:xmltext returns a subtree of the XML document in XML form. For example, when you match an internal node, you can use this metaproperty to return an XML string, rather than the concatenated values of the descendant text nodes.
column-xpath A string or variable that specifies the schema of the result set and how the value is found for each column in the result set. If a FORMAT XML clause expression matches more than one node, then only the first node in the document order is used. If the node is not a text node, then the result is found by appending all the text node descendants. If a FORMAT XML clause expression does not match any nodes, then the column for that row is NULL.
namespace A string or variable containing an XML document. The in-scope namespaces for the query are taken from the root element of the document.
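For example, the following is a minimal sketch assuming an XML file in which each row element carries id and name attributes; the file layout, table, and XPath patterns are all illustrative. Each element matching the row-xpath produces one row, and the column XPath expressions are assumed to map positionally to the listed columns, following openxml conventions.
-- Illustrative input: <data><row id="1" name="Alice"/><row id="2" name="Bob"/></data>
LOAD TABLE MyTable ( id, name )
USING FILE 'c:\\temp\\input.xml'
FORMAT XML '/data/row' ( '@id', '@name' );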
HEXADECIMAL clause Use this clause to specify whether to read binary values as hexadecimals. By default, HEXADECIMAL is ON. With HEXADECIMAL ON, binary column values are read as 0xnnnnnn..., where 0x is a zero followed by an x, and each n is a hexadecimal digit. It is important to use HEXADECIMAL ON when dealing with multibyte character sets.
The HEXADECIMAL clause can be used only with the FORMAT TEXT clause.
ORDER clause Use this clause to specify whether to sort the data when loading. The default for ORDER is ON. If ORDER is ON, and a clustered index has been declared, then LOAD TABLE sorts the input data according to the clustered index and inserts rows in the same order. If the data you are loading is already sorted, you should set ORDER to OFF.
PCTFREE clause Use this clause to specify the percentage of free space you want to reserve for each table page. This setting overrides any permanent setting for the table, but only for the duration of the load, and only for the data being loaded. The value percent-free-space is an integer between 0 and 100. A value of 0 specifies that no free space is to be left on each page—each page is to be fully packed. A high value causes each row to be inserted into a page by itself.
QUOTE clause The QUOTE clause is for TEXT data only; the string is placed around string values. The default is a single quote (apostrophe).
QUOTES clause Use this clause to specify whether strings are enclosed in quotes. When QUOTES is set to ON (the default), the LOAD TABLE statement expects strings to be enclosed in quote characters. The quote character is either an apostrophe (single quote) or a quotation mark (double quote); the first such character encountered in a string is treated as the quote character for the string. Strings must be terminated by a matching quote.
When QUOTES is set to ON, column delimiter strings can be included in column values. Also, quote characters are assumed not to be part of the value. Therefore, the following line is treated as two values, not three, despite the presence of the comma in the address. Also, the quotes surrounding the address are not inserted into the database.
'123 High Street, Anytown',(715)398-2354
To include a quote character in a value, when QUOTES is set to ON, you must use two quotes. The following line includes a value in the third column that is a single quote character:
'123 High Street, Anytown','(715)398-2354',''''
ROW DELIMITED BY clause Use this clause to specify the string that indicates the end of an input record. The default row delimiter string is a newline (\n); however, it can be any string up to 255 bytes in length (for example, ROW DELIMITED BY '###'). If you wanted to use a tab as the row delimiter, you could specify the hexadecimal escape sequence for the tab character (9), ROW DELIMITED BY '\x09'. If your delimiter string contains a \n, it matches either \r\n or \n.
SKIP clause Use this clause to specify whether to ignore lines at the beginning of a file. The integer argument specifies the number of lines to skip. You can use this clause to skip over a line containing column headings, for example. If the row delimiter is not the default (newline), then skipping may not work correctly if the data contains the row delimiter embedded within a quoted string.
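For example, the following sketch skips a single header line containing column names; the table and path are illustrative only:
LOAD TABLE Employees
FROM 'c:\\temp\\input.dat'
SKIP 1;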
STRIP clause Use this clause to specify whether unquoted values should have leading or trailing blanks stripped off before they are inserted. The STRIP option accepts the following options:
STRIP OFF Do not strip off leading or trailing blanks.
STRIP LTRIM Strip leading blanks.
STRIP RTRIM Strip trailing blanks.
STRIP BOTH Strip both leading and trailing blanks.
STRIP ON Deprecated. Equivalent to STRIP RTRIM.
The STRIP behavior is tied to the QUOTES clause. If you specify QUOTES OFF, then STRIP OFF, STRIP LTRIM, STRIP RTRIM, and STRIP BOTH, work exactly as their wording suggests. If you do not specify a QUOTES clause, or you specify QUOTES ON, unquoted strings are always left-trimmed and right-trimmed (however, you can specify STRIP OFF or STRIP LTRIM if you don't want the strings to be right-trimmed as well).
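For example, the following sketch loads unquoted data and strips only trailing blanks, leaving leading blanks intact; the table and path are illustrative only:
LOAD TABLE Employees
FROM 'c:\\temp\\input.dat'
QUOTES OFF
STRIP RTRIM;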
WITH CHECKPOINT clause Use this clause to specify whether to perform a checkpoint. The default setting is OFF. If this clause is set to ON, a checkpoint is issued after successfully completing and logging the statement. If this clause is set to ON, and the database requires automatic recovery before a checkpoint is issued, the data file used to load the table must be present for the recovery to complete successfully if you use FILE NAME LOGGING. If WITH CHECKPOINT ON is specified, and recovery is subsequently required, recovery begins after the checkpoint, and the data file need not be present.
The data files are required, regardless of what is specified for this clause, if the database becomes corrupt and you need to use a backup and apply the current log file if you use FILE NAME LOGGING.
WITH { FILE NAME | ROW | CONTENT } LOGGING Use this clause to control the level of detail logged in the transaction log during a load operation. The levels of logging are as follows:
WITH FILE NAME LOGGING clause The WITH FILE NAME LOGGING clause causes only the LOAD TABLE statement to be recorded in the transaction log. To guarantee consistent results when the transaction log is used during recovery, the file used for the original load operation must be present in its original location, and must contain the original data. This level of logging has the best performance; however, you should not use it if your database is involved in mirroring or synchronization. Also, this level can not be used when loading from an expression or a client file.
When you do not specify a logging level in the LOAD TABLE statement, WITH FILE NAME LOGGING is the default level when specifying:
FROM filename-expression
USING FILE filename-expression
WITH ROW LOGGING clause The WITH ROW LOGGING clause causes each row that is loaded to be recorded in the transaction log as an INSERT statement. This level of logging is recommended for databases involved in synchronization and is the default for database mirroring when using FROM filename-expression or USING FILE filename-expression. However, when loading large amounts of data, this logging type can impact performance, and results in a much longer transaction log.
This level is also ideal for databases where the table being loaded into contains non-deterministic values, such as computed columns or CURRENT TIMESTAMP defaults. If there are no non-deterministic values, WITH CONTENT LOGGING likely results in better performance.
WITH CONTENT LOGGING clause The WITH CONTENT LOGGING clause causes the database server to copy the input file to the transaction log in chunks. These chunks can be reconstituted into a copy of the input file later, for example during recovery from the transaction log. When loading large amounts of data, this logging type has a very low impact on performance, and offers increased data protection, but it does result in a longer transaction log. This level of logging is recommended for databases involved in mirroring, or where it is desirable to not maintain the original data files for later recovery provided there are no non-deterministic values.
The WITH CONTENT LOGGING clause cannot be used if the database is involved in synchronization. The WITH CONTENT LOGGING clause is required if the table is being loaded from a client file.
When you do not specify a logging level in the LOAD TABLE statement, WITH CONTENT LOGGING is the default level when specifying:
USING CLIENT FILE client-filename-expression
USING VALUE value-expression
USING COLUMN column-expression
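Regardless of these defaults, a logging level can be given explicitly. For example, the following sketch requests row-level logging, which is appropriate for databases involved in mirroring or synchronization; the table and path are illustrative only:
LOAD TABLE Employees
FROM 'c:\\temp\\input.dat'
WITH ROW LOGGING;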
statistics-limitation-option Allows you to limit the columns for which statistics are generated during the execution of LOAD TABLE. Otherwise, statistics are generated for all columns. You should only use this clause if you are certain that statistics will not be used on some columns. You can specify ON ALL COLUMNS (the default), OFF, ON KEY COLUMNS, or a list of columns for which statistics should be generated.
LOAD TABLE allows efficient mass insertion into a database table from a file. LOAD TABLE is more efficient than the Interactive SQL statement INPUT.
LOAD TABLE places a write lock on the whole table. For base tables, global temporary tables, and local temporary tables, a commit is performed.
If you attempt to use LOAD TABLE on a table on which an immediate text index is built, or that is referenced by an immediate view, the load fails. This does not occur for non-immediate text indexes or materialized views; however, it is strongly recommended that you truncate the data in dependent indexes and materialized views before executing the LOAD TABLE statement, and then refresh the indexes and views after.
Do not use the LOAD TABLE statement on a temporary table for which ON COMMIT DELETE ROWS was specified, either explicitly or by default, at creation time. However, you can use LOAD TABLE if ON COMMIT PRESERVE ROWS or NOT TRANSACTIONAL was specified.
With FORMAT TEXT, a NULL value is indicated by specifying no value. For example, if three values are expected and the file contains 1,,'Fred', then the values inserted are 1, NULL, and Fred. If the file contains 1,2,, then the values 1, 2, and NULL are inserted. Values that consist only of spaces are also considered NULL values. For example, if the file contains 1, ,'Fred', then the values 1, NULL, and Fred are inserted. All other values are considered not NULL. For example, '' (a single quote followed by a single quote) is an empty string. 'NULL' is a string containing four letters.
If a column being loaded by LOAD TABLE does not allow NULL values and the file value is NULL, then numeric columns are given the value 0 (zero), character columns are given an empty string (''). If a column being loaded by LOAD TABLE allows NULL values and the file value is NULL, then the column value is NULL (for all types).
If the table contains columns a, b, and c, and the input data contains a, b, and c, but the LOAD statement specifies only a and b as columns to load data into, the following values are inserted into column c:
if DEFAULTS ON is specified, and column c has a default value, the default value is used.
if column c does not have a default defined for it and it allows NULLs, then a NULL is used.
if column c does not have a default defined for it and it does not allow NULLs, then either a zero (0) or an empty string ('') is used, or an error is returned, depending on the data type of the column.
LOAD TABLE and database mirroring If you are using database mirroring and execute a LOAD TABLE statement on a base table, you must specify either WITH ROW LOGGING or WITH CONTENT LOGGING as the logging level for the statement. These clauses allow the loaded data to be recorded in the transaction log so that it can be loaded into the mirroring database as well. If these clauses are not specified, an error is reported.
LOAD TABLE and column statistics To create histograms on table columns, LOAD TABLE captures column statistics when it loads data. The histograms are used by the optimizer.
Following are additional tips about loading and column statistics:
LOAD TABLE saves statistics on base tables for future use. It does not save statistics on global temporary tables.
If you are loading into an empty table that may have previously contained data, it may be beneficial to drop the statistics for the column before executing the LOAD TABLE statement.
If column statistics exist when LOAD TABLE is performed on a column, statistics for the column are not recalculated. Instead, statistics for the new data are inserted into the existing statistics. This means that if the existing column statistics are out-of-date, they will still be out of date after loading new data into the column. If you suspect that the column statistics are out of date, you should consider updating them either before, or after, executing the LOAD TABLE statement.
LOAD TABLE adds statistics only if the table has five or more rows. If the table has at least five rows, histograms are modified as follows:
Data already in table? | Histogram present? | Action taken |
---|---|---|
Yes | Yes | Integrate changes into the existing histograms |
Yes | No | Do not build histograms |
No | Yes | Integrate changes into the existing histograms |
No | No | Build new histograms |
LOAD TABLE does not generate statistics for columns that contain NULL values for more than 90% of the rows being loaded.
Using dynamically constructed file names You can execute a LOAD TABLE statement with a dynamically constructed file name by assigning the file name to a variable and using the variable name in the LOAD TABLE statement.
The permissions required to execute a LOAD TABLE statement depend on the database server -gl option, as follows:
If the -gl option is set to ALL, you must be the owner of the table or have DBA authority or have ALTER privileges.
If the -gl option is set to DBA, you must have DBA authority.
If the -gl option is set to NONE, LOAD TABLE is not permitted.
Requires an exclusive lock on the table.
When reading from a file on a client computer:
READCLIENTFILE authority is required.
Read permissions are required on the directory being read from.
The allow_read_client_file database option must be enabled.
The read_client_file secured feature must be enabled.
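For example, the following sketch (with an illustrative user name, and assuming DBA authority) grants the required authority and enables the database option; the read_client_file secured feature is controlled separately at the server through the -sf option:
GRANT READCLIENTFILE TO load_user;
SET OPTION PUBLIC.allow_read_client_file = 'On';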
Automatic commit.
Inserts are not recorded in the transaction log file unless the WITH ROW LOGGING clause is specified, so depending upon the logging type, the inserted rows may not be recoverable in the event of a failure. The original file is required if you need to recover the rows and WITH FILE NAME LOGGING is used. In addition, the LOAD TABLE statement should not be used without the WITH ROW LOGGING clause in databases that are used as MobiLink clients, or in databases involved in SQL Remote replication, because these technologies replicate changes through analysis of the log file.
The LOAD TABLE statement does not fire any triggers associated with the table.
A checkpoint is carried out at the beginning of the operation. A second checkpoint is performed at the end if WITH CHECKPOINT ON is specified.
Column statistics are updated if a significant amount of data is loaded.
SQL/2008 Vendor extension.
Following is an example of LOAD TABLE. First, you create a table, and then load data into it using a file called input.txt.
CREATE TABLE t(
    a CHAR(100),
    let_me_default INT DEFAULT 1,
    c CHAR(100)
);
Following is the content of a file called input.txt:
ignore_me, this_is_for_column_c, this_is_for_column_a
The following LOAD statement loads the file called input.txt:
LOAD TABLE T ( filler(), c, a ) FROM 'input.txt' FORMAT TEXT DEFAULTS ON;
The command SELECT * FROM t yields the result set:
a | let_me_default | c |
---|---|---|
this_is_for_column_a | 1 | this_is_for_column_c |
The following example executes the LOAD TABLE statement with a dynamically constructed file name by assigning the file name to a variable inside a stored procedure:
CREATE PROCEDURE LoadData( IN from_file LONG VARCHAR )
BEGIN
    DECLARE path LONG VARCHAR;
    SET path = 'd:\\data\\' || from_file;
    LOAD TABLE MyTable FROM path;
END;
The following example loads UTF-8-encoded table data into mytable:
LOAD TABLE mytable FROM 'mytable_data_in_utf8.dat' ENCODING 'UTF-8';