LibGL: Change GLsizei from unsigned to signed integral type
There is some really wild stuff going on in the OpenGL spec here. The Khronos website states that GLsizei is a 32-bit non-negative value used for sizes; however, some functions such as `glGenTextures` state that the input `n` may be negative (in which case `GL_INVALID_VALUE` is generated), which implies signedness. Most other implementations of `gl.h` seem to `typedef` this to `int`, so we should too.
parent
2b123cc592
commit
8a69e6714e
Notes:
sideshowbarker
2024-07-18 17:22:23 +09:00
Author: https://github.com/Quaker762 Commit: https://github.com/SerenityOS/serenity/commit/8a69e6714e7 Pull-request: https://github.com/SerenityOS/serenity/pull/7024 Reviewed-by: https://github.com/alimpfard Reviewed-by: https://github.com/ccapitalK Reviewed-by: https://github.com/predmond Reviewed-by: https://github.com/sunverwerth
1 changed file with 1 addition and 1 deletion
@@ -163,7 +163,7 @@ typedef unsigned int GLuint;
 typedef int GLfixed;
 typedef long long GLint64;
 typedef unsigned long long GLuint64;
-typedef unsigned long GLsizei;
+typedef int GLsizei;
 typedef void GLvoid;
 typedef float GLfloat;
 typedef float GLclampf;
|