Item11697: Some wiki apps suffer because $topicObjects are either fully loaded or not (can't just fetch a part of a topic)
Current State: Proposal Required
Released In: n/a
Target Release: n/a
This is a continuation of Item11088, and isn't necessarily a SEARCH
problem, but rather one reflecting the fact that MongoDBPlugin
either fully loads a topic (resulting in transmission of the entire JSON object from the DB over the wire), or doesn't load it at all.
This task should focus on the problem of rendering small pieces
of hundreds to thousands of topics in a single page view.
In our case, the http://wiki.trin.org.au/Mangroves/Avicennia_marina
page (and others) falls into this category: we render just a part
or two from a few hundred (maybe 200) different topics in a single page view.
This results in the page view pulling down hundreds of MiB
from mongo, which takes several seconds (sometimes 10+). If MongoDBPlugin
could be instructed that we want only a predefined list of meta parts from each hit in the result set, we could drastically improve performance and slash the amount of data transmitted.
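To illustrate the idea, here is a minimal sketch of building a MongoDB projection so the server returns only the requested meta parts of each topic document instead of the whole thing. The field names (`_topic`, `TOPICINFO`, `FIELD`) are assumptions loosely modelled on how a topic might be stored, not MongoDBPlugin's actual schema.

```python
def build_projection(parts):
    """Build a MongoDB projection dict that keeps only the requested
    meta parts (plus the topic name), so each hit comes back as a
    fraction of the full topic document."""
    projection = {"_topic": 1}  # field name is an assumption
    for part in parts:
        projection[part] = 1
    return projection

# Hypothetical usage with pymongo (db/collection names are made up):
# from pymongo import MongoClient
# db = MongoClient()["foswiki"]
# hits = db["Mangroves"].find(
#     {"_topic": {"$in": topic_names}},
#     build_projection(["TOPICINFO", "FIELD"]))
```

With a projection like this, the wire transfer per hit shrinks to just the parts the page actually renders.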
- 27 Mar 2012
I agree with the principle behind this.
It has struck me quite a lot with my VDBI work how daft it is to:
- Not load the required data while you're doing the query, instead performing another query to load the selected data
- Or load all the data when you only need some of it
But doesn't this hit a lot of core code? Wouldn't each topicObject need to track each META field as loaded or not?
The long and short of this is that it really requires a feature proposal, as it needs core support.
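To sketch what that META tracking might look like: a topicObject could record which parts have been loaded and fetch each one from the store only on first access. This is a hypothetical illustration, not Foswiki's actual topicObject API; the `fetch_part` callback stands in for whatever store interface the core would provide.

```python
class LazyTopicObject:
    """Hypothetical topic object that tracks which META parts are
    loaded, fetching each part from the store only on first use."""

    def __init__(self, name, fetch_part):
        self.name = name
        self._fetch_part = fetch_part  # callback: (topic, part) -> data
        self._parts = {}               # part name -> loaded data

    def get(self, part):
        # Load-on-demand: hit the store only for parts actually used.
        if part not in self._parts:
            self._parts[part] = self._fetch_part(self.name, part)
        return self._parts[part]

    def is_loaded(self, part):
        return part in self._parts
```

The per-part bookkeeping is exactly the "track each META field as loaded or not" cost mentioned above, but it means a page that touches one formfield never pays for the topic text.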
- 26 Sep 2012
In brief discussions with SvenDowideit, it seemed
it might be possible to enhance the resultset iterator somehow (waving hands) so that query/store engines could dynamically/incrementally make each hit prefetch targeted topic parts, accumulating new parts to prefetch on the fly... or something...
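One very hand-wavy sketch of that iterator idea: callers declare which parts they want, and the iterator fetches hits in batches, folding newly requested parts into each subsequent batch. The class name, `batch_fetch` interface, and batch size are all assumptions for illustration only.

```python
class PrefetchingResultSet:
    """Hypothetical result-set iterator that accumulates the meta
    parts callers ask for and prefetches them batch by batch, so
    later hits arrive with those parts already loaded."""

    def __init__(self, topic_names, batch_fetch, batch_size=50):
        self._names = list(topic_names)
        # batch_fetch: (names, parts) -> {name: {part: data}}
        self._batch_fetch = batch_fetch
        self._batch_size = batch_size
        self._wanted = set()

    def want(self, *parts):
        # Callers declare parts; future batches will include them.
        self._wanted.update(parts)

    def __iter__(self):
        for i in range(0, len(self._names), self._batch_size):
            batch = self._names[i:i + self._batch_size]
            fetched = self._batch_fetch(batch, sorted(self._wanted))
            for name in batch:
                yield name, fetched.get(name, {})
```

The batching is what turns "200 topics, one round trip each" into a handful of round trips, each carrying only the parts the page has asked for so far.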
yeah, it's an incredibly deep topic, and certainly needs more than just a task item.
- 26 Sep 2012