I need a small tool (command line would be best) that synchronizes a dBase database to a MySQL database.
As a demonstration you can have a look at [url removed, login to view]
I don't need bidirectional synchronization yet.
The tool should be for Suse Linux.
Most important is that the tool is fast, as it should run approximately once a minute; ideal would be a tool that syncs in real time.
The data source is a bunch of .dbf files; demo data will be delivered.
Last info: the dbf files will be stored on a network drive that is mounted locally (I don't think that this matters).
As there are some frequently asked questions:
1. Attached now are some sample dbf files as one zip archive.
2. The data in MySQL will be one table per dbf file.
3. The tables will be named PREFIX + dbf name.
4. The columns in MySQL have to be created by the tool according to the dbf data (so 1:1).
The dbf files are ANSI-encoded; the tool has to handle that (German special characters such as äöü...).
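To illustrate what I mean by the 1:1 column mapping and the encoding handling, here is a minimal Python sketch. The type mapping is my own suggestion, and the assumption that "ANSI" here means Windows-1252 (cp1252, the usual German Windows codepage) is mine as well:

```python
# Hypothetical sketch: map dBase field descriptors 1:1 to MySQL columns.
# Field tuples are (name, type_code, length, decimals); the type codes
# C (character), N (numeric), F (float), D (date), L (logical) and
# M (memo) come from the dBase file format.

def mysql_column(name, type_code, length, decimals):
    if type_code == "C":
        return f"`{name}` VARCHAR({length})"
    if type_code == "N":
        # numeric fields with decimals become DECIMAL, otherwise integers
        return (f"`{name}` DECIMAL({length},{decimals})" if decimals > 0
                else f"`{name}` BIGINT")
    if type_code == "F":
        return f"`{name}` DOUBLE"
    if type_code == "D":
        return f"`{name}` DATE"
    if type_code == "L":
        return f"`{name}` BOOLEAN"
    return f"`{name}` TEXT"  # memo fields and anything unknown

def create_table_sql(prefix, dbf_name, fields):
    table = prefix + dbf_name
    cols = ",\n  ".join(mysql_column(*f) for f in fields)
    # utf8mb4 so the decoded German umlauts are stored correctly
    return (f"CREATE TABLE IF NOT EXISTS `{table}` (\n  {cols}\n)"
            " CHARACTER SET utf8mb4")

def decode_field(raw):
    # "ANSI" on a German Windows system is almost always Windows-1252
    return raw.decode("cp1252").rstrip()
```

For example, decode_field(b"M\xfcller   ") gives "Müller", so the umlauts survive the trip into a utf8mb4 table.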
And again some additional information:
The dbf files will contain between 30 and 1,000 entries, perhaps sometimes even more.
The names of the dbf files can change, so your tool should take the filename(s) as a parameter and synchronize the given files to MySQL.
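A minimal Python sketch of the invocation I have in mind: one or more dbf file names as positional arguments. The --prefix option name is only an assumption on my part:

```python
# Hypothetical command line: the tool takes the dbf file name(s) as
# parameters, since the names can change between deployments.
import argparse

def parse_args(argv=None):
    parser = argparse.ArgumentParser(
        description="Synchronize dbf files to MySQL")
    parser.add_argument("files", nargs="+",
                        help="dbf file(s) to synchronize")
    parser.add_argument("--prefix", default="sync_",
                        help="prefix for the MySQL table names")
    return parser.parse_args(argv)
```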
There will be a bunch of dbf files that already exist.
These files are synchronized to MySQL once as a starting point.
The dbf files will change frequently.
The changes have to be reflected in the MySQL database as fast as possible.
From my point of view the tool would have to:
synchronize the files once and create the corresponding mysql tables
do some hashing or whatever to detect changes
find the changed entry in the dbf
synchronize this change to the mysql db
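The hashing step above could be sketched like this in Python. The record layout (a dict of field name to value) and the use of MD5 are my assumptions; any stable fingerprint would do:

```python
# Sketch of the hashing idea: fingerprint every record so a later run
# can tell changed rows apart without comparing field by field.
import hashlib

def record_hash(record):
    """Stable MD5 fingerprint of one record (dict of field -> value)."""
    # sort the field names so the hash doesn't depend on dict order
    raw = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.md5(raw.encode("utf-8")).hexdigest()
```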
How this is done I don't care; what's important for me is only to have the data from the dbf files in the MySQL database as soon as possible.
Additional Info (Added 3/31/2011 at 12:04 EST)...
I have now had some developers offering a PHP solution.
I have no problem if this is done with PHP; PHP will be installed on the machine that hosts MySQL, and any needed module can be installed.
But I'm not sure if the required update frequency can be reached with PHP.
There are 23 dbf files that have to be synchronized, though of course not all of them all the time.
Each of these files can have up to 1,000 or even more records.
If you have a good PHP solution that meets the requirement of a sync in under one minute or even faster, I would be totally satisfied with it.
I think this could be done if you are really firm with hashing etc. and can find updated records very fast.
But if you were to run through the whole dbf every time, checking each record against MySQL, I don't see any chance of getting this done in the given timeframe...
But from my point of view, this would be better done with a little program, in whatever language, that is scheduled or started by a command on a regular basis.
I have to learn that I should describe my projects in more detail from the beginning...
There is no timestamp in the dbf files, so the tool will have to find out by itself what has changed between the last synchronization and now.
I think this could be achieved in different ways:
1. keeping a local copy of the synchronized dbf files and comparing it to the originals
2. by some kind of hashing
3. by relying on the "last changed date" on filesystem level
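Option 3 could be sketched like this in Python; the state file name and its JSON format are assumptions of mine:

```python
# Sketch of option 3: skip files whose filesystem modification time
# hasn't moved since the last run. The state is persisted as JSON so
# it survives between scheduled runs.
import json
import os

STATE_FILE = "sync_state.json"  # hypothetical location

def load_state():
    try:
        with open(STATE_FILE) as fh:
            return json.load(fh)
    except FileNotFoundError:
        return {}

def changed_files(paths, state):
    """Return the subset of paths whose mtime differs from the stored one."""
    out = []
    for p in paths:
        mtime = os.path.getmtime(p)
        if state.get(p) != mtime:
            out.append(p)
            state[p] = mtime  # remember for the next run
    return out
```

Note that mtime granularity on a network mount can be coarse, so this is best combined with option 2 as a second check.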
In all cases the steps would be:
1. synchronize dbfs with mysql
2. store whatever needed to find the differences
3. check which dbf files are changed
4. find the updated records, which is probably the most difficult step
5. write the changes to the mysql and restart at step 2
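Steps 2 to 4 could be sketched as a comparison of two {key: hash} snapshots; the choice of key (for example the dBase record number) is an assumption:

```python
# Sketch of the diff step: compare the stored snapshot from the last
# run with a fresh one to find inserted, updated and deleted records.

def diff_snapshots(old, new):
    """Both arguments map a record key to that record's hash."""
    inserted = [k for k in new if k not in old]
    updated = [k for k in new if k in old and old[k] != new[k]]
    deleted = [k for k in old if k not in new]
    return inserted, updated, deleted
```

Only the returned keys would then be written to MySQL (INSERT, UPDATE, DELETE respectively), and the fresh snapshot becomes the stored one for the next run.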
There are about 21 dbf files that have to be synchronized, but I assume that only about 5 files really change between runs.
You have full root rights on the machine and can install whatever is needed (as long as it doesn't produce extra costs, or the price is reasonable).