python - Many different sharded counters in a single transaction


I have to increment 3 different counters in a single transaction. Besides that, I have to manipulate 3 other entities as well. I get:

too many entity groups in a single transaction

I've used the recipe from https://developers.google.com/appengine/articles/sharding_counters to implement the counters. I increment the counters inside model (class) methods, depending on the business logic.
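For reference, this is roughly how such an increment looks (a minimal sketch in the spirit of that recipe; SimpleCounterShard, NUM_SHARDS and increment are illustrative names, not the exact recipe code):

import random
from google.appengine.ext import ndb

NUM_SHARDS = 20

class SimpleCounterShard(ndb.Model):
    """One shard of a named counter; the key id is '<name>-<shard index>'."""
    count = ndb.IntegerProperty(default=0)

@ndb.transactional
def increment(name):
    # Pick a random shard and bump it. Every shard lives in its own entity
    # group, which is exactly why touching three counters plus three other
    # entities exceeds the entity-group limit of one transaction.
    index = random.randint(0, NUM_SHARDS - 1)
    key = ndb.Key(SimpleCounterShard, '%s-%d' % (name, index))
    shard = key.get()
    if shard is None:
        shard = SimpleCounterShard(key=key)
    shard.count += 1
    shard.put()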

As a workaround I implemented a deferred increment method that uses tasks to update the counters. This doesn't scale if the number of counters increases further, because there is a limit on the number of transactional tasks in a single transaction (I think it's 5), and I guess it's not an effective way anyway.
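Roughly, that workaround looks like this (a sketch; increment is the recipe-style transactional increment from above, and do_business_logic stands in for my model method):

from google.appengine.ext import ndb, deferred

@ndb.transactional(xg=True)
def do_business_logic(counter_names):
    # ... update the three other entities here ...
    for name in counter_names:
        # Each counter is bumped in its own task, outside this transaction's
        # entity groups. The task is only enqueued if the transaction commits,
        # but only a handful of transactional tasks are allowed per transaction.
        deferred.defer(increment, name, _transactional=True)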

I also found https://github.com/docsavage/sharded_counter/blob/master/counter.py, which seems to ensure the counter gets updated even in case of a datastore error by going through memcache. But I don't want to increment the counters at all if the transaction fails.

Another idea is to remember which counters have to be incremented during the web request and then increment them all in a single deferred task. But I don't know how to implement this in a clean and thread-safe way without passing objects created in the request into the model methods. I think this code is ugly, and it's not in the same transaction:

def my_request_handler():
    counter_session = model.counter_session()
    model.mylogic(counter_session, other_params)
    counter_session.write()
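A sketch of what I have in mind for such a counter session (CounterSession and its methods are hypothetical names; increment is again the sharded increment):

from google.appengine.ext import deferred

class CounterSession(object):
    """Collects increments during one request and flushes them later."""

    def __init__(self):
        self._pending = {}

    def add(self, name, delta=1):
        self._pending[name] = self._pending.get(name, 0) + delta

    def write(self):
        # One deferred task applies all increments after the request's
        # transaction; note that this is still not part of that transaction.
        if self._pending:
            deferred.defer(_apply_increments, self._pending)

def _apply_increments(pending):
    for name, delta in pending.items():
        for _ in range(delta):
            increment(name)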

Any experiences or ideas?

BTW: I'm using Python, ndb and Flask. It's OK if the counters are not 100% accurate.

As said in Transactions and entity groups:

The simplest approach is to determine which entities you need to be able to process in the same transaction. Then, when you create those entities, place them in the same entity group by declaring them with a common ancestor. They will then all be in the same entity group, and you will always be able to update and read them transactionally.
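For example (a sketch with illustrative names; Counter and Item stand for your counters and other entities), if everything hangs off one common ancestor, a single non-cross-group transaction can update all of it:

from google.appengine.ext import ndb

class Counter(ndb.Model):
    count = ndb.IntegerProperty(default=0)

class Item(ndb.Model):
    value = ndb.StringProperty()

ROOT = ndb.Key('Root', 'my-root')  # common ancestor for all entities below

@ndb.transactional
def update_all(counter_names, item_values):
    # All keys share the same parent, so everything is one entity group.
    for name in counter_names:
        key = ndb.Key(Counter, name, parent=ROOT)
        counter = key.get() or Counter(key=key)
        counter.count += 1
        counter.put()
    for ident, value in item_values.items():
        key = ndb.Key(Item, ident, parent=ROOT)
        item = key.get() or Item(key=key)
        item.value = value
        item.put()

Keep in mind that a single entity group only supports on the order of one write per second, so this trades away the write throughput that sharding the counters was meant to provide.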

