When you loop over query results using NDB's fetch_page, you will eventually hit an out-of-memory error (Exceeded soft private memory limit), even though the fetched items should be garbage collected batch by batch.
The culprit is that NDB keeps references to those items in its in-context cache, so they cannot be garbage collected. Call ndb.get_context().clear_cache() after each fetch_page to release them.
from google.appengine.ext import ndb

limit = 500
query = Item.query(Item.is_active == True)

items, next_cursor, more = query.fetch_page(limit)
while items:
    ...  # process the current batch of items
    # del items and gc.collect() alone do not help, because NDB's
    # in-context cache still holds references to the entities.
    ndb.get_context().clear_cache()
    if not more:
        break
    items, next_cursor, more = query.fetch_page(limit, start_cursor=next_cursor)