Universal Python database client (supporting MySQL, PostgreSQL, MongoDB, etc. with the same query interface)
I usually need to:

1. read a dataset from a database (MySQL, MongoDB)
2. split the dataset into several groups, then process or compute on each group
3. use multiprocessing or distributed workers to process the data
4. be able to stop, resume, and recover tasks (this requires saving task status, and knowing how the data was split in step 1)
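The stop/resume part of step 4 can be handled with a small task-status table. Here is a minimal sketch under assumptions not in the question: the dataset has an integer primary key, each task is one id range, and the status table is illustrated with SQLite so the snippet is self-contained (in practice it could live in the same MySQL database as the data):

```python
import sqlite3

def init_tasks(conn, min_id, max_id, chunk):
    """Split [min_id, max_id] into chunks and record one pending task each."""
    conn.execute("""CREATE TABLE IF NOT EXISTS tasks (
        lo INTEGER, hi INTEGER, status TEXT DEFAULT 'pending')""")
    for lo in range(min_id, max_id + 1, chunk):
        hi = min(lo + chunk - 1, max_id)
        conn.execute("INSERT INTO tasks (lo, hi) VALUES (?, ?)", (lo, hi))
    conn.commit()

def pending_tasks(conn):
    """Tasks still to run -- this is what makes stop/resume possible."""
    return conn.execute("SELECT lo, hi FROM tasks WHERE status = 'pending'").fetchall()

def mark_done(conn, lo, hi):
    """Called by a worker after its range is fully processed."""
    conn.execute("UPDATE tasks SET status = 'done' WHERE lo = ? AND hi = ?", (lo, hi))
    conn.commit()

conn = sqlite3.connect(":memory:")
init_tasks(conn, 1, 100, 30)     # four tasks: 1-30, 31-60, 61-90, 91-100
print(pending_tasks(conn))       # [(1, 30), (31, 60), (61, 90), (91, 100)]
mark_done(conn, 1, 30)
print(len(pending_tasks(conn)))  # 3
```

On restart, a driver only has to re-read `pending_tasks` and hand the remaining ranges to workers; nothing already marked done is recomputed.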
A one-off processing run is easy, but the dataset is usually large, and splitting it into tasks centrally would be slow as well. It would be better to generate split queries (in step 1) and run each one inside its own worker. But I haven't found a good general way to generate such split queries. It is similar to MapReduce, but not exactly the same.
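One common way to get such split queries, assuming an integer primary key without huge gaps (an assumption, not something from the question), is to generate one `BETWEEN` range query per worker, so only query strings are shipped around instead of rows. A hedged sketch (`cars`, `id`, and `worker` are hypothetical placeholders):

```python
from multiprocessing import Pool

def split_queries(table, id_col, min_id, max_id, n_workers):
    """Generate one range query per worker over an integer key column."""
    step = (max_id - min_id) // n_workers + 1
    queries = []
    for lo in range(min_id, max_id + 1, step):
        hi = min(lo + step - 1, max_id)
        queries.append(f"SELECT * FROM {table} WHERE {id_col} BETWEEN {lo} AND {hi}")
    return queries

def worker(query):
    # Each worker would open its own connection and run only its slice.
    # Placeholder: a real worker would execute the query and process rows.
    return query

if __name__ == "__main__":
    qs = split_queries("cars", "id", 1, 1000, 4)
    with Pool(4) as pool:
        results = pool.map(worker, qs)
    print(len(results))  # 4
```

The min/max of the key can be fetched up front with one cheap `SELECT MIN(id), MAX(id)` query, so the driver never scans the data itself; if the keys are sparse or non-numeric, a keyset-pagination scheme would be needed instead.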
I would like to find a library or framework that can connect to many different databases and use the same query language for all of them (SQLAlchemy can't do this, since it only covers relational databases, not MongoDB).