| author | Nicolai Hähnle <nicolai.haehnle@amd.com> | 2016-01-11 15:56:22 -0500 |
|---|---|---|
| committer | Nicolai Hähnle <nicolai.haehnle@amd.com> | 2016-02-03 14:04:11 +0100 |
| commit | bc8a6842a95aac4e41d11817f8f05b287f3fea6c (patch) | |
| tree | 62354ca68f6ca9f7a52645f98df6efb45aad6275 /docs | |
| parent | 761c7d59c4403832c33d931bb097d060ed07e555 (diff) | |
mesa: add MESA_NO_MINMAX_CACHE environment variable
When set to a truish value, this globally disables the minmax cache for all
buffer objects.
No #ifdef DEBUG guards because this option can be interesting for
benchmarking.
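A "truish" environment-variable check of this kind can be sketched in C as below. This is an illustrative sketch only: the helper names (`env_value_is_truish`, `minmax_cache_globally_disabled`) are hypothetical and are not the functions this commit actually adds to Mesa's buffer-object code.

```c
#include <stdbool.h>
#include <stdlib.h>
#include <string.h>
#include <strings.h>

/* Hypothetical sketch: decide whether an environment variable value
 * counts as "truish". Unset, empty, "0", "false", and "no" disable
 * the option; anything else enables it. */
static bool
env_value_is_truish(const char *s)
{
   if (s == NULL || s[0] == '\0')
      return false;
   if (strcmp(s, "0") == 0 ||
       strcasecmp(s, "false") == 0 ||
       strcasecmp(s, "no") == 0)
      return false;
   return true;
}

/* Hypothetical wrapper: query MESA_NO_MINMAX_CACHE once per call. */
static bool
minmax_cache_globally_disabled(void)
{
   return env_value_is_truish(getenv("MESA_NO_MINMAX_CACHE"));
}
```

In a real driver such a flag would typically be read once and cached, rather than calling `getenv` on every draw; the sketch keeps it simple for clarity.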
Reviewed-by: Marek Olšák <marek.olsak@amd.com>
Diffstat (limited to 'docs')
-rw-r--r-- | docs/envvars.html | 1
1 file changed, 1 insertion, 0 deletions
diff --git a/docs/envvars.html b/docs/envvars.html
index 5bb7b1e65b..ba83335d0b 100644
--- a/docs/envvars.html
+++ b/docs/envvars.html
@@ -96,6 +96,7 @@
 glGetString(GL_SHADING_LANGUAGE_VERSION). Valid values are integers, such
 as "130". Mesa will not really implement all the features of the given
 language version if it's higher than what's normally reported. (for
 developers only)
 <li>MESA_GLSL - <a href="shading.html#envvars">shading language compiler options</a>
+<li>MESA_NO_MINMAX_CACHE - when set, the minmax index cache is globally disabled.
 </ul>