Fix unsigned int parsing in GLSL
Unsigned integers in GLSL were being parsed with the regular signed
integer parser, so values were limited to INT_MAX. Values from
INT_MAX + 1 to UINT_MAX could not be parsed correctly.
Also added constant folding for the four bit-conversion GLSL
functions.
Fixes shader compilation issue in the Epic Zen Garden example:
https://s3.amazonaws.com/mozilla-games/ZenGarden/EpicZenGarden.html
(unfortunately, the screen is still black, so other issues remain)
Fixes WebGL 2 test: conformance2/glsl3/float-parsing.html
Change-Id: Iae52b2c8e083f0e1a22599e5a583297b9850444d
Reviewed-on: https://swiftshader-review.googlesource.com/16648
Tested-by: Alexis Hétu <sugoi@google.com>
Reviewed-by: Nicolas Capens <nicolascapens@google.com>
diff --git a/src/OpenGL/compiler/util.cpp b/src/OpenGL/compiler/util.cpp
index 2905c1d..bd9783c 100644
--- a/src/OpenGL/compiler/util.cpp
+++ b/src/OpenGL/compiler/util.cpp
@@ -32,3 +32,11 @@
*value = std::numeric_limits<int>::max();
return success;
}
+
+bool atou_clamp(const char *str, unsigned int *value)
+{
+ bool success = pp::numeric_lex_int(str, value);
+ if(!success)
+ *value = std::numeric_limits<unsigned int>::max();
+ return success;
+}