I'm doing competitive price crawling as a service. Right now, my crawlers write to a MySQL database hosted on AWS. To get the data to the client, I currently export it to an Excel spreadsheet and offer that as a download. However, this is cumbersome, because the client then has to download the spreadsheet and upload it into their own system every day.
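For context, my current export step looks roughly like the sketch below (simplified: an in-memory SQLite database stands in for the AWS MySQL instance here, and the table and column names are made up for illustration; in production I connect with a MySQL driver and write an actual .xlsx file):

```python
import sqlite3
import csv

# Stand-in for the AWS MySQL database the crawlers write to.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (sku TEXT, competitor TEXT, price REAL)")
conn.executemany(
    "INSERT INTO prices VALUES (?, ?, ?)",
    [("A1", "shopX", 9.99), ("A1", "shopY", 10.49)],
)

# Daily export: dump the latest crawl to a spreadsheet-style file
# that the client downloads and re-uploads on their side.
rows = conn.execute("SELECT sku, competitor, price FROM prices").fetchall()
with open("daily_prices.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["sku", "competitor", "price"])
    writer.writerows(rows)
```

It works, but the manual download/upload round-trip is exactly the part I want to eliminate.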
I am wondering what the options are for a more streamlined solution. Would it be possible to push the data from my AWS cloud database to the client's local database on a daily schedule? What would be involved in such a job? What if the client uses a different DBMS (such as SQL Server or PostgreSQL)? How would one accomplish an automated data push from the cloud to a client's local database?