This example shows you how to set up a scripted upload that provides conflict detection. It creates the consolidated and remote databases, stored procedures, publications, and subscriptions that scripted upload requires. The example is presented so that you can either read through it or cut and paste the text to run the sample.
Create a directory to hold the sample files. For example, call it scriptedupload. Open a command prompt and navigate to that directory.
(This example specifies file names and assumes they are in the current directory. In a real application, you should specify the full path to each file.)
Run the following command to create a consolidated database:
dbinit consol.db
Next, run the following command to define an ODBC data source for the consolidated database:
dbdsn -w dsn_consol -y -c "uid=DBA;pwd=sql;dbf=consol.db;server=consol"
To use a database as a consolidated database, you must run a setup script that adds system tables, views, and stored procedures that are used by MobiLink. The following command sets up consol.db as a consolidated database:
dbisql -c "dsn=dsn_consol" %SQLANY12%\MobiLink\setup\syncsa.sql
Open Interactive SQL and connect to consol.db using the dsn_consol DSN. Run the following SQL statements. They create the employee table on the consolidated database, insert values into the table, and create the required synchronization scripts.
CREATE TABLE employee (
   id unsigned integer primary key,
   name varchar( 256 ),
   salary numeric( 9, 2 )
);

INSERT INTO employee VALUES( 100, 'smith', 225000 );
COMMIT;

CALL ml_add_table_script( 'default', 'employee', 'upload_insert',
   'INSERT INTO employee ( id, name, salary )
    VALUES ( {ml r.id}, {ml r.name}, {ml r.salary} )' );

CALL ml_add_table_script( 'default', 'employee', 'upload_update',
   'UPDATE employee SET name = {ml r.name}, salary = {ml r.salary}
    WHERE id = {ml r.id}' );

CALL ml_add_table_script( 'default', 'employee', 'upload_delete',
   'DELETE FROM employee WHERE id = {ml r.id}' );

CALL ml_add_table_script( 'default', 'employee', 'download_cursor',
   'SELECT * FROM employee' );
At a command prompt in your sample directory, run the following command to create a remote database:
dbinit remote.db
Next, run the following command to define an ODBC data source:
dbdsn -w dsn_remote -y -c "uid=DBA;pwd=sql;dbf=remote.db;server=remote"
In Interactive SQL, connect to remote.db using the dsn_remote DSN. Run the following set of statements to create objects in the remote database.
First, create the table to be synchronized. The insert_time and delete_time columns are not synchronized but contain information used by the upload stored procedures to determine which rows to upload.
CREATE TABLE employee (
   id unsigned integer primary key,
   name varchar( 256 ),
   salary numeric( 9, 2 ),
   insert_time timestamp default '1900-01-01'
);
Next, you need to define the stored procedures, triggers, and supporting tables that handle the upload. You do this separately for inserts, updates, and deletes.
First, create a trigger to set the insert_time on each row when it is inserted. This timestamp is used to determine if a row has been inserted since the last synchronization. This trigger is not fired when dbmlsync is applying downloaded inserts from the consolidated database because later in this example you set the FireTriggers extended option to off. Rows inserted by the download get an insert_time of 1900-01-01, the default value defined when the employee table was created. This value should always be before the start progress so those rows are not treated as new inserts and are not uploaded during the next synchronization.
CREATE TRIGGER emp_ins AFTER INSERT ON employee
REFERENCING NEW AS newrow
FOR EACH ROW
BEGIN
   UPDATE employee SET insert_time = CURRENT TIMESTAMP
   WHERE id = newrow.id
END;
Next, create a procedure to return as a result set all the inserted rows to be uploaded. This procedure returns all rows that (based on the insert_time) have been inserted since the last successful upload but were not subsequently deleted. The time of the last successful upload is determined from the start progress value in the #hook_dict table. This example uses the default setting for the dbmlsync extended option LockTables, which causes dbmlsync to lock the tables being synchronized. As a result, you do not need to exclude rows inserted after the end progress: the table locks prevent any operations from occurring while the upload is built.
CREATE PROCEDURE employee_insert()
RESULT( id unsigned integer,
   name varchar( 256 ),
   salary numeric( 9, 2 ) )
BEGIN
   DECLARE start_time timestamp;

   SELECT value INTO start_time
   FROM #hook_dict
   WHERE name = 'start progress as timestamp';

   // Upload as inserts all rows inserted after the start_time
   // that were not subsequently deleted.
   SELECT id, name, salary
   FROM employee e
   WHERE insert_time > start_time
      AND NOT EXISTS( SELECT id FROM employee_delete ed
                      WHERE ed.id = e.id );
END;
To handle updates, you need to ensure that the correct pre-image is used, based on the start progress in effect when the upload was built.
First, create a table that maintains pre-images of updated rows. The pre-images are used when generating the scripted upload.
CREATE TABLE employee_preimages (
   id unsigned integer NOT NULL,
   name varchar( 256 ),
   salary numeric( 9, 2 ),
   img_time timestamp default CURRENT TIMESTAMP,
   primary key( id, img_time )
);
Next, create a trigger to store a pre-image for each row when it is updated. As with the insert trigger, this trigger is not fired on download.
Note that this trigger stores a pre-image row each time a row is updated (unless two updates come so close together that they receive the same timestamp). At first glance this looks wasteful: it would be tempting to store a pre-image only when one is not already in the table, and then rely on the sp_hook_dbmlsync_upload_end hook to delete pre-images once they have been uploaded.
However, the sp_hook_dbmlsync_upload_end hook is not reliable for this purpose. The hook may not be called at all if a hardware or software failure stops dbmlsync after the upload is sent but before it is acknowledged, leaving rows in the pre-images table even though they have been successfully uploaded. Also, when a communication failure occurs, dbmlsync may not receive an acknowledgement from the server for an upload; in this case, the upload status passed to the hook is 'unknown', and there is no way for the hook to tell whether the pre-images table should be cleaned or left intact. By storing multiple pre-images, the correct one can always be selected based on the start progress in effect when the upload is built.
CREATE TRIGGER emp_upd AFTER UPDATE OF name, salary ON employee
REFERENCING OLD AS oldrow
FOR EACH ROW
BEGIN
   INSERT INTO employee_preimages ON EXISTING SKIP
   VALUES( oldrow.id, oldrow.name, oldrow.salary, CURRENT TIMESTAMP );
END;
Next, create an upload procedure to handle updates. This stored procedure returns a single result set with twice as many columns as the result sets of the other scripts: it contains both the pre-image (the values in the row the last time it was received from, or successfully uploaded to, the MobiLink server) and the post-image (the values to be entered into the consolidated database).
The pre-image is the earliest set of values in employee_preimages that was recorded after the start_progress. Note that this example does not correctly handle existing rows that are deleted and then reinserted. In a more complete solution, these would be uploaded as an update; a rough sketch of that approach follows the procedure below.
CREATE PROCEDURE employee_update()
RESULT( preimage_id unsigned integer,
   preimage_name varchar( 256 ),
   preimage_salary numeric( 9, 2 ),
   postimage_id unsigned integer,
   postimage_name varchar( 256 ),
   postimage_salary numeric( 9, 2 ) )
BEGIN
   DECLARE start_time timestamp;

   SELECT value INTO start_time
   FROM #hook_dict
   WHERE name = 'start progress as timestamp';

   // Upload as an update all rows that have been updated since
   // start_time that were not newly inserted or deleted.
   SELECT ep.id, ep.name, ep.salary, e.id, e.name, e.salary
   FROM employee e JOIN employee_preimages ep ON ( e.id = ep.id )
   // Do not select rows inserted since the start time. These should be
   // uploaded as inserts.
   WHERE insert_time <= start_time
      // Do not upload deleted rows.
      AND NOT EXISTS( SELECT id FROM employee_delete ed
                      WHERE ed.id = e.id )
      // Select the earliest pre-image after the start time.
      AND ep.img_time = ( SELECT MIN( img_time )
                          FROM employee_preimages
                          WHERE id = ep.id
                             AND img_time > start_time );
END;
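As a rough illustration of the limitation noted above (this sketch is not part of the sample and is untested), rows that appear in both employee and employee_delete were deleted and then reinserted since the last upload. A more complete employee_update procedure might add them to its result set with a UNION branch such as the following, which assumes the row was not also updated before it was deleted; in that case the pre-image would have to come from employee_preimages instead:

// Hypothetical UNION branch for employee_update: upload a
// delete-then-reinsert as an update. The pre-image is the set of
// values captured by the delete trigger; the post-image is the
// reinserted row.
SELECT ed.id, ed.name, ed.salary, e.id, e.name, e.salary
FROM employee e JOIN employee_delete ed ON ( e.id = ed.id )
WHERE e.insert_time > start_time
   AND ed.delete_time > start_time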
To handle deletes, first create a table to maintain a list of deleted rows:
CREATE TABLE employee_delete (
   id unsigned integer primary key NOT NULL,
   name varchar( 256 ),
   salary numeric( 9, 2 ),
   delete_time timestamp
);
Next, create a trigger to populate the employee_delete table as rows are deleted from the employee table. This trigger is not fired during download because later you set the dbmlsync extended option FireTriggers to off. Note that this trigger assumes that a deleted row is never reinserted, so it does not handle the same row being deleted more than once.
CREATE TRIGGER emp_del AFTER DELETE ON employee
REFERENCING OLD AS delrow
FOR EACH ROW
BEGIN
   INSERT INTO employee_delete
   VALUES( delrow.id, delrow.name, delrow.salary, CURRENT TIMESTAMP );
END;
The next SQL statement creates an upload procedure to handle deletes. This stored procedure returns a result set that contains the rows to delete on the consolidated database. The stored procedure uses the employee_preimages table so that if a row is updated and then deleted, the image uploaded for the delete is the last one that was successfully downloaded or uploaded.
CREATE PROCEDURE employee_delete()
RESULT( id unsigned integer,
   name varchar( 256 ),
   salary numeric( 9, 2 ) )
BEGIN
   DECLARE start_time timestamp;

   SELECT value INTO start_time
   FROM #hook_dict
   WHERE name = 'start progress as timestamp';

   // Upload as a delete all rows that were deleted after the
   // start_time that were not inserted after the start_time.
   // If a row was updated before it was deleted, then the row
   // to be deleted is the pre-image of the update.
   SELECT IF ep.id IS NULL THEN ed.id ELSE ep.id ENDIF,
          IF ep.id IS NULL THEN ed.name ELSE ep.name ENDIF,
          IF ep.id IS NULL THEN ed.salary ELSE ep.salary ENDIF
   FROM employee_delete ed LEFT OUTER JOIN employee_preimages ep
      ON ( ed.id = ep.id AND ep.img_time > start_time )
   WHERE
      // Only upload deletes that occurred since the last sync.
      ed.delete_time > start_time
      // Do not upload a delete for rows that were inserted since
      // the last upload and then deleted.
      AND NOT EXISTS( SELECT id FROM employee e
                      WHERE e.id = ep.id
                         AND e.insert_time > start_time )
      // Select the earliest pre-image after the start time.
      AND ( ep.id IS NULL
            OR ep.img_time = ( SELECT MIN( img_time )
                               FROM employee_preimages
                               WHERE id = ep.id
                                  AND img_time > start_time ) );
END;
Next, create an upload_end hook to clean up the employee_preimages and employee_delete tables when an upload is successful. This example uses the default setting for the dbmlsync extended option LockTables, so the tables are locked during synchronization. As a result, you do not have to worry about leaving rows in the tables for operations that occurred after the end progress: locking prevents such operations from occurring.
CREATE PROCEDURE sp_hook_dbmlsync_upload_end()
BEGIN
   DECLARE val varchar( 256 );

   SELECT value INTO val
   FROM #hook_dict
   WHERE name = 'upload status';

   IF val = 'committed' THEN
      DELETE FROM employee_delete;
      DELETE FROM employee_preimages;
   END IF;
END;
Next, create the publication, MobiLink user, and subscription. The publication pub1 uses the scripted upload syntax (WITH SCRIPTED UPLOAD). It creates an article for the employee table and registers the three stored procedures you just created for use in the scripted upload. The statements also create a MobiLink user called u1 and a subscription between u1 and pub1 that uses script version 'default', matching the synchronization scripts added on the consolidated database. The extended option FireTriggers is set to off to prevent triggers from being fired on the remote database when the download is applied, which prevents downloaded changes from being uploaded during the next synchronization.
CREATE PUBLICATION pub1 WITH SCRIPTED UPLOAD (
   TABLE employee( id, name, salary )
   USING (
      PROCEDURE employee_insert FOR UPLOAD INSERT,
      PROCEDURE employee_update FOR UPLOAD UPDATE,
      PROCEDURE employee_delete FOR UPLOAD DELETE
   )
);

CREATE SYNCHRONIZATION USER u1;

CREATE SYNCHRONIZATION SUBSCRIPTION TO pub1 FOR u1
TYPE 'tcpip'
ADDRESS 'host=localhost'
OPTION FireTriggers='off'
SCRIPT VERSION 'default';
Connect to the remote database and insert data to synchronize using scripted upload. For example, run the following SQL statements against the remote database in Interactive SQL:
INSERT INTO employee( id, name, salary ) VALUES( 7, 'black', 700 );
INSERT INTO employee( id, name, salary ) VALUES( 8, 'anderson', 800 );
INSERT INTO employee( id, name, salary ) VALUES( 9, 'dilon', 900 );
INSERT INTO employee( id, name, salary ) VALUES( 10, 'dwit', 1000 );
INSERT INTO employee( id, name, salary ) VALUES( 11, 'dwit', 1100 );
COMMIT;
At a command prompt, start the MobiLink server:
mlsrv12 -c "dsn=dsn_consol" -o mlserver.mls -v+ -dl -zu+
Start a synchronization using dbmlsync:
dbmlsync -c "dsn=dsn_remote" -k -uo -o remote.mlc -v+
You can now verify that the inserts were uploaded.
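For example, connect to the consolidated database (for example, with dbisql -c "dsn=dsn_consol") and run a query such as the following. The five rows inserted on the remote database should now appear alongside the original row for id 100:

SELECT id, name, salary
FROM employee
ORDER BY id;

To exercise the update and delete scripts as well, you could modify the remote data and then run the same dbmlsync command again. For example, run the following statements against the remote database:

UPDATE employee SET salary = 750 WHERE id = 7;
DELETE FROM employee WHERE id = 8;
COMMIT;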
To clean up your computer after completing the example, perform the following steps:
mlstop -h -w
dbstop -y -c "uid=DBA;pwd=sql;server=consol"
dbstop -y -c "uid=DBA;pwd=sql;server=remote"
dberase -y consol.db
dberase -y remote.db
del remote.mlc mlserver.mls
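The cleanup commands above do not remove the two ODBC data sources created earlier. If you want to delete those as well, you can remove them with dbdsn:

dbdsn -y -d dsn_consol
dbdsn -y -d dsn_remote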