Migrate from Oracle to MySQL
We ran into serious performance problems with our Oracle database, and we are now trying to migrate to a MySQL-based database (either MySQL directly or, more preferably, Infobright).
The thing is, we need to let the old and the new system overlap for at least some weeks, if not months, before we know whether all the features of the new database match our needs.
So, here is our situation:
The Oracle database consists of multiple tables with millions of rows each. During the day, there are literally thousands of statements, which we cannot stop for the migration.
Every morning, new data is imported into the Oracle database, replacing some thousands of rows. Copying this process is not a problem, so we could, in theory, import into both databases in parallel.
But, and here lies the challenge, for this to work we need an export of the Oracle database from one day in a consistent state. (We cannot export some tables on Monday and others on Tuesday, etc.) This means that at least the export should be finished in less than one day.
Our first thought was to dump the schema, but I wasn't able to find a tool that can import an Oracle dump file into MySQL. Exporting the tables to CSV files might work, but I'm afraid it could take too long.
So my question is:
What should I do? Is there any tool to import Oracle dump files into MySQL? Does anyone have experience with such a large-scale migration?
PS: Please, don't suggest performance optimization techniques for Oracle, we already tried a lot :-)
Edit: We already tried some ETL tools before, only to find out that they were not fast enough: exporting just one table took more than 4 hours ...
2nd edit: Come on folks ... did nobody ever try to export a whole database as fast as possible and convert the data so that it can be imported into another database system?
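For the "export and convert" route, the conversion step itself can be a simple streaming filter. A minimal Python sketch follows, under the assumption that the tables are exported as CSV and that the main incompatibility is Oracle's default DD-MON-YYYY date format versus MySQL's YYYY-MM-DD; the column layout and file names are made up for illustration:

```python
import csv
import io
from datetime import datetime

def convert_row(row, date_columns):
    """Rewrite Oracle-style DD-MON-YYYY dates to MySQL's YYYY-MM-DD."""
    out = list(row)
    for i in date_columns:
        if out[i]:
            out[i] = datetime.strptime(out[i], "%d-%b-%Y").strftime("%Y-%m-%d")
    return out

def convert_csv(src, dst, date_columns):
    """Stream one exported CSV file into a MySQL-friendly one."""
    reader = csv.reader(src)
    writer = csv.writer(dst, lineterminator="\n")
    for row in reader:
        writer.writerow(convert_row(row, date_columns))

# Tiny demo with in-memory files; column 1 holds an Oracle-formatted date.
src = io.StringIO("42,01-JAN-2009,foo\n")
dst = io.StringIO()
convert_csv(src, dst, date_columns=[1])
print(dst.getvalue())  # -> 42,2009-01-01,foo
```

Because it streams row by row, a filter like this can run concurrently with the export (e.g. through a pipe), so it should not add much to the total wall-clock time.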
Oracle does not supply an out-of-the-box unload utility.
Keep in mind that without comprehensive info about your environment (Oracle version? server platform? how much data? what datatypes?) everything here is YMMV, and you would want to give it a go on your own system for performance and timing.
My points 1-3 are just generic data movement ideas. Point 4 is a method that will reduce the downtime or interruption to minutes or seconds.
1) There are 3rd party utilities available. I have used a few of these, but it is best for you to check them out yourself for your intended purpose. A few 3rd party products are listed here: OraFAQ. Unfortunately a lot of them run on Windows, which would slow down the data unload process unless your DB server was on Windows and you could run the load utility directly on the server.
2) If you don't have any complex datatypes like LOBs then you can roll your own with SQL*Plus. If you did it a table at a time then you can easily parallelize it. This topic has been visited on this site probably more than once, here is an example: Linky
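A minimal sketch of such a roll-your-own SQL*Plus unload, with an illustrative table and column list (you would generate one script like this per table and run several of them in parallel):

```sql
-- unload_orders.sql: spool one table to CSV with SQL*Plus
-- (table and column names here are made up for illustration)
SET HEADING OFF
SET FEEDBACK OFF
SET PAGESIZE 0
SET LINESIZE 32767
SET TRIMSPOOL ON

SPOOL /export/orders.csv

SELECT order_id   || ',' ||
       customer_id || ',' ||
       TO_CHAR(order_date, 'YYYY-MM-DD')
  FROM orders;

SPOOL OFF
EXIT
```

The resulting file can then be bulk-loaded on the MySQL side with LOAD DATA INFILE, which is usually much faster than row-by-row INSERTs.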
3) If you are on 10g+ then External Tables might be a performant way to accomplish this task. If you create some blank external tables with the same structure as your current tables and copy the data into them, the data will be converted to the external table format (a text file). Once again, OraFAQ to the rescue.
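A sketch of the external-table unload (directory path and table names are illustrative). One caveat worth noting: the ORACLE_DATAPUMP access driver used for unloading writes a proprietary dump-file format rather than plain text, so for a MySQL target you would more likely use this to stage data for another Oracle instance, or fall back to the SQL*Plus/CSV approach for text output:

```sql
-- one-time setup: a directory object pointing at the export area
CREATE DIRECTORY unload_dir AS '/export';

-- unload a table by creating an external table from a query (10g+)
CREATE TABLE orders_ext
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY unload_dir
    LOCATION ('orders_ext.dmp')
  )
  AS SELECT * FROM orders;
```

The CREATE TABLE ... AS SELECT runs as a single set-based operation inside the database, which is why this tends to be fast.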
4) If you must maintain the systems in parallel for days/weeks/months then use a change data capture/apply tool for near-zero downtime. Be prepared to pay $$$. I have used Golden Gate Software's tool, which can mine the Oracle redo logs and supply insert/update statements to a MySQL database. You can migrate the bulk of the data with no downtime the week before go-live. Then, during your go-live period, you shut down the source database, have Golden Gate catch up the last remaining transactions, and open up access to your new target database. I have used this for upgrades and the catch-up period was only a few minutes. We already had site licenses for Golden Gate so it wasn't anything out-of-pocket for us.
And I'll play the role of cranky DBA here and say that if you can't get Oracle performing well, I would love to see a write-up of how MySQL fixed your particular issues. If you have an application where you can't touch the SQL, there are still lots of possible ways to tune Oracle. /soapbox
mysql oracle import migration dump