This tutorial describes how to transfer data from a table to an external file, change the data in the table, and then use the transfer command again to repopulate the table from the external file, showing that the incoming rows replace the changed data in the table rather than creating duplicates.
Although this example transfers data out of, and back into, the same table, in a typical user scenario, data would be transferred out of one table and then into a different one.
Create the spt_TableTransfer table, which stores transfer history:
sp_setup_table_transfer
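As an optional verification step (an addition to this tutorial, not one of its original steps), you can confirm that the setup procedure created the history table by querying the system catalog:

-- Confirm the history table exists in the current database;
-- type 'U' indicates a user table.
select name, type
from sysobjects
where name = 'spt_TableTransfer'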
Configure max transfer history. The default is 10, which means that Adaptive Server retains 10 successful and 10 unsuccessful transfers for each table marked for incremental transfer. This example changes the value of max transfer history from 10 to 5:
sp_configure 'max transfer history', 5
Parameter Name       Default Memory Used Config Value Run Value Unit  Type    Instance Name
-------------------- ------- ----------- ------------ --------- ----- ------- -------------
max transfer history 10      0           5            5         bytes dynamic NULL
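Running sp_configure with only the parameter name displays the current setting without changing it, so you can verify the new value at any time:

-- Display-only form of sp_configure; omitting the value makes no change.
sp_configure 'max transfer history'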
Create the transfer_example table, which has the transfer attribute enabled and uses datarow locking:
create table transfer_example (
    f1 int,
    f2 varchar(30),
    f3 bigdatetime,
    primary key (f1)
) lock datarows
with transfer table on
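If a table already exists without the attribute, you do not need to re-create it; the attribute can be toggled afterward with alter table, using the same syntax this tutorial applies later (existing_table below is a hypothetical placeholder, and the database must have select into enabled first):

-- existing_table is a placeholder name, not part of this tutorial.
alter table existing_table set transfer table on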
Populate the transfer_example table with sample data:
set nocount on
declare @i int, @vc varchar(1024), @bdt bigdatetime
select @i = 1
while @i <= 10
begin
    select @vc = replicate(char(64 + @i), 3 * @i)
    select @bdt = current_bigdatetime()
    insert into transfer_example values (@i, @vc, @bdt)
    select @i = @i + 1
end
set nocount off
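To see why the output below contains runs of repeated letters of increasing length, consider the expression that builds f2: for @i = 3, char(64 + 3) evaluates to 'C', and replicate repeats it 3 * 3 times:

-- Worked example of the f2 expression for @i = 3;
-- returns 'CCCCCCCCC', matching row 3 of the output below.
select replicate(char(64 + 3), 3 * 3)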
The script produces this data:
select * from transfer_example order by f1
f1          f2                             f3
----------- ------------------------------ ----------------------------
          1 AAA                            Jul 17 2009 4:40:14.465789PM
          2 BBBBBB                         Jul 17 2009 4:40:14.488003PM
          3 CCCCCCCCC                      Jul 17 2009 4:40:14.511749PM
          4 DDDDDDDDDDDD                   Jul 17 2009 4:40:14.536653PM
          5 EEEEEEEEEEEEEEE                Jul 17 2009 4:40:14.559480PM
          6 FFFFFFFFFFFFFFFFFF             Jul 17 2009 4:40:14.583400PM
          7 GGGGGGGGGGGGGGGGGGGGG          Jul 17 2009 4:40:14.607196PM
          8 HHHHHHHHHHHHHHHHHHHHHHHH       Jul 17 2009 4:40:14.632152PM
          9 IIIIIIIIIIIIIIIIIIIIIIIIIII    Jul 17 2009 4:40:14.655184PM
         10 JJJJJJJJJJJJJJJJJJJJJJJJJJJJJJ Jul 17 2009 4:40:14.678938PM
Transfer the transfer_example data to an external file using the for ase format:
transfer table transfer_example to 'transfer_example-data.ase' for ase
(10 rows affected)
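The ase format writes rows in a form intended for loading back into Adaptive Server. The transfer table command also accepts other formats; as a hedged variation (assuming your version supports the csv format option), a character-separated transfer would produce a file you can inspect in a text editor:

-- Hypothetical variation; the csv format choice and file name are
-- illustrative, not part of this tutorial.
transfer table transfer_example to 'transfer_example-data.csv' for csv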
The data transfer creates this history record in spt_TableTransfer:
select id, sequence_id, end_code, ts_floor, ts_ceiling, row_count from spt_TableTransfer where id = object_id('transfer_example')
id          sequence_id end_code ts_floor ts_ceiling row_count
----------- ----------- -------- -------- ---------- ---------
  592002109           1        0        0       5309        10
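In this history row, end_code 0 records that the transfer succeeded, row_count matches the 10 rows reported above, and ts_floor and ts_ceiling bracket the timestamp range the transfer covered; a later incremental transfer resumes from that range. Because sequence_id increases with each successful transfer, you can review repeated transfers with a query such as this (it uses only the columns already shown):

-- Review the transfer history for the table, oldest transfer first.
select sequence_id, end_code, row_count
from spt_TableTransfer
where id = object_id('transfer_example')
order by sequence_id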
Disable the transfer attribute on transfer_example to demonstrate that a receiving table does not need the transfer attribute enabled to receive incremental data (the database must have select into enabled before you can run this alter table command):
alter table transfer_example set transfer table off
After the alter table command runs, spt_TableTransfer is empty:
select id, sequence_id, end_code, ts_floor, ts_ceiling, row_count from spt_TableTransfer where id = object_id('transfer_example')
id          sequence_id end_code ts_floor ts_ceiling row_count
----------- ----------- -------- -------- ---------- ---------

(0 rows affected)
Update transfer_example, setting its character data to 'no data' and its bigdatetime column to a fixed date and time, so you can verify that the table no longer contains the original data:
update transfer_example set f2 = 'no data', f3 = 'Jan 1, 1900 12:00:00.000001AM'
(10 rows affected)
After the update, transfer_example contains this data:
select * from transfer_example order by f1
f1          f2                             f3
----------- ------------------------------ ----------------------------
          1 no data                        Jan 1 1900 12:00:00.000001AM
          2 no data                        Jan 1 1900 12:00:00.000001AM
          3 no data                        Jan 1 1900 12:00:00.000001AM
          4 no data                        Jan 1 1900 12:00:00.000001AM
          5 no data                        Jan 1 1900 12:00:00.000001AM
          6 no data                        Jan 1 1900 12:00:00.000001AM
          7 no data                        Jan 1 1900 12:00:00.000001AM
          8 no data                        Jan 1 1900 12:00:00.000001AM
          9 no data                        Jan 1 1900 12:00:00.000001AM
         10 no data                        Jan 1 1900 12:00:00.000001AM

(10 rows affected)
Transfer the example data from the external file back into transfer_example. Even though transfer_example is no longer marked for incremental transfer, you can still transfer data into the table. Because the table has a unique primary key index, the incoming rows replace the existing data and do not create duplicate-key errors (a quick row-count check follows the transfer output below):
transfer table transfer_example from 'transfer_example-data.ase' for ase
(10 rows affected)
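As a quick check that the incoming rows replaced the existing ones rather than being appended, count the rows; the result should be the original 10, not 20:

-- Returns 10: the transfer replaced rows in place instead of appending.
select count(*) from transfer_example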
Select all data from transfer_example to verify that the incoming data replaced the changed data. The transfer restored the original contents of the transfer_example.f2 and transfer_example.f3 columns, which were stored in the transfer_example-data.ase output file.
select * from transfer_example order by f1
f1          f2                             f3
----------- ------------------------------ ----------------------------
          1 AAA                            Jul 17 2009 4:40:14.465789PM
          2 BBBBBB                         Jul 17 2009 4:40:14.488003PM
          3 CCCCCCCCC                      Jul 17 2009 4:40:14.511749PM
          4 DDDDDDDDDDDD                   Jul 17 2009 4:40:14.536653PM
          5 EEEEEEEEEEEEEEE                Jul 17 2009 4:40:14.559480PM
          6 FFFFFFFFFFFFFFFFFF             Jul 17 2009 4:40:14.583400PM
          7 GGGGGGGGGGGGGGGGGGGGG          Jul 17 2009 4:40:14.607196PM
          8 HHHHHHHHHHHHHHHHHHHHHHHH       Jul 17 2009 4:40:14.632152PM
          9 IIIIIIIIIIIIIIIIIIIIIIIIIII    Jul 17 2009 4:40:14.655184PM
         10 JJJJJJJJJJJJJJJJJJJJJJJJJJJJJJ Jul 17 2009 4:40:14.678938PM
Reenable the transfer attribute for transfer_example so that subsequent transfers use the previous parameters by default:
alter table transfer_example set transfer table on
(10 rows affected)
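Note that because disabling the attribute emptied spt_TableTransfer, the next outbound transfer has no history to resume from, so it would export every row; transfers after that would again be incremental, writing only rows changed since the previous successful transfer. A hedged sketch of such a follow-up (the file name is illustrative):

-- Hypothetical follow-up transfer; exports all rows the first time
-- after re-enabling, and only changed rows on later runs.
transfer table transfer_example to 'transfer_example-data2.ase' for ase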