Datasource caching #2214
datasource reads are already cached, though I don't think it's documented... If you use the …
Thanks for the reply. Looking at the source code, I see that the string content of the datasource is cached, but not the parsed result. I'm working with a large JSON file, so the time it takes to parse each time results in a ~100x slowdown compared to using context.
@BenjyWiener ah. is using context an option? Usually I recommend that anyway, given the simpler syntax. Reopening since there may be a possible feature request here...
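(For readers comparing the two approaches, a minimal sketch; the alias `cfg` and key `someKey` are placeholders and not from this issue. The datasource would be wired up with `-d cfg=...` and the context with `-c cfg=...`.)

```
{{/* datasource approach: fetched through the ds/datasource function */}}
{{ (ds "cfg").someKey }}

{{/* context approach: the parsed data is exposed directly on the context */}}
{{ .cfg.someKey }}
```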
I'm currently using context, but I use a lot of nested/recursive templates, which requires a lot of …
ok, so then the data and parsed objects are already cached.
I'm not sure I follow why this affects caching - could you elaborate? A concrete example would help...
This issue is stale because it has been open for 60 days with no activity. If it's still relevant, one of the following will remove the stale label.
I have a large JSON datasource that's used in a nested template, which is executed many times in a loop.
I currently load the datasource at the top level and use `dict` to pass the value along with the actual context to the nested template. I would like to be able to load the datasource from the nested template where it's used, but the runtime impact of reading and deserializing the JSON file each iteration is huge (50-100x slower in my case). An option to cache datasource reads would solve this.
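As a rough illustration of that workaround (a sketch only; the names `bigdata`, `items`, `ctx`, and `big` are placeholders):

```
{{/* top-level template: read the datasource once, then hand it to the
     nested template alongside each loop item via dict */}}
{{ $big := ds "bigdata" }}
{{ range .items }}
  {{ template "nested" (dict "ctx" . "big" $big) }}
{{ end }}

{{ define "nested" }}
  {{/* has to unpack the dict; calling ds "bigdata" here instead would,
       per this issue, re-parse the JSON on every iteration */}}
  {{ .ctx.name }} -> {{ index .big "lookup" }}
{{ end }}
```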
Possible mechanisms (sketched below):

- A `cache` option in the config file and/or as a query parameter
- A new function that caches datasource reads (`datasourceCached`)
- A more general set of functions that allow writing to and reading from some global "state"
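None of these mechanisms exist in gomplate today; purely as a hypothetical sketch of how the first two proposals might look in use:

```
{{/* hypothetical: a cache option as a query parameter on the datasource URL,
     e.g. defined as -d bigdata=./big.json?cache=true, with template usage unchanged */}}
{{ (ds "bigdata").someKey }}

{{/* hypothetical: a dedicated caching variant of the datasource function */}}
{{ (datasourceCached "bigdata").someKey }}
```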