I have a table of medical information in my local database. I crawl and parse records daily and store them in that table.
Suppose there were 1500 records initially, and today 100 more records were added on my local machine.
Now I need to push those records to a server (so the database is a different one). I copied the database over yesterday, so the server's table currently holds the 1500 entries. How can I sync the 100 new entries to the live application?
Please note that I cannot delete old entries from my local machine, or they would be crawled again and again.
Solution
You may want to use the 'SELECT ... INTO OUTFILE' and 'LOAD DATA INFILE ... INTO TABLE' commands.
Edit: Elaboration...
Given the table structures:
CREATE TABLE my_local_table (
  id int NOT NULL auto_increment PRIMARY KEY,
  data varchar(20),
  created_on datetime
);

CREATE TABLE server_table (
  id int NOT NULL auto_increment PRIMARY KEY,
  data varchar(20),
  created_on datetime,
  local_id int
);
And some bogus data:
INSERT INTO my_local_table (data, created_on) VALUES ('test', now()), ('test2', now());
You would use the following commands:
SELECT id, data, created_on
FROM my_local_table
WHERE created_on >= '2011-08-18'
INTO OUTFILE '/tmp/t.txt';
-- (and on the server)
LOAD DATA LOCAL INFILE '/tmp/t.txt'
INTO TABLE server_table
(local_id, data, created_on);
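Because server_table keeps the originating local_id, you can also derive the cut-off from the server itself instead of hardcoding a date, which avoids missing or duplicating rows if the job does not run exactly once per day. A sketch of that idea follows; the remote host name and the COALESCE fallback are assumptions, not part of the original answer:

```shell
# Ask the server for the highest local id it has already received.
# -N -s suppress the column header so the value is usable directly.
if command -v mysql >/dev/null 2>&1; then
    LAST_ID=$(mysql -h remote_server -N -s -e \
        "SELECT COALESCE(MAX(local_id), 0) FROM server_table" 2>/dev/null)
fi
# Fall back to 0 when the query could not run (first sync, no client, ...).
LAST_ID=${LAST_ID:-0}

# Export only rows newer than what the server already has.
EXPORT_QUERY="SELECT id, data, created_on
              FROM my_local_table
              WHERE id > $LAST_ID
              INTO OUTFILE '/tmp/t.txt'"
command -v mysql >/dev/null 2>&1 && mysql -e "$EXPORT_QUERY" || true
```

Filtering on the auto_increment id works here because rows are only ever appended locally, so a higher id always means a newer row.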
To automate the two steps, you can use a bash script / batch file that calls mysql, connecting first to the local server to run the first statement, then to the remote server to execute the second. Note that the LOCAL keyword in LOAD DATA LOCAL INFILE makes the remote server read the file from the client machine you run mysql on, so both commands can be issued from your local box.
mysql -e 'SELECT....';
mysql -h remote_server -e 'LOAD DATA...';
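Put together, the automation might look like the following sketch. The database names, the remote host, and the /tmp path are placeholders you would adapt to your setup:

```shell
#!/bin/sh
# Sketch of the two-step sync. Database names (my_local_db,
# my_server_db) and the host name are assumptions, not from the answer.

export_new_rows() {
    # Dump rows created on or after the given date from the local DB.
    mysql my_local_db -e "SELECT id, data, created_on
                          FROM my_local_table
                          WHERE created_on >= '$1'
                          INTO OUTFILE '/tmp/t.txt'"
}

load_rows_on_server() {
    # LOCAL makes the remote server read /tmp/t.txt from this client.
    mysql -h remote_server my_server_db -e \
        "LOAD DATA LOCAL INFILE '/tmp/t.txt'
         INTO TABLE server_table
         (local_id, data, created_on)"
}

# Only attempt the sync when a mysql client is actually installed.
if command -v mysql >/dev/null 2>&1; then
    export_new_rows "$(date +%F)" && load_rows_on_server \
        || echo "sync skipped: mysql servers not reachable"
fi
```

Scheduling this script from cron (or Task Scheduler on Windows) then keeps the server table up to date without ever deleting the local rows.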