File: 1743217265104497.jpg (196 KB, 1000x970)
i keep getting errors when compiling a shader, i have no idea why ( ´,_ゝ`)
this is all from the tutorial at https://learnopengl.com/Getting-started/Hello-Triangle
i just tried to clean it up a little by creating a function to compile shaders instead of putting it all in main()



bool getCompileResult(unsigned int *shader, const char *src) {
    glShaderSource(*shader, 1, &src, nullptr);
    // the error is in glShaderSource! (error is GL_INVALID_VALUE)
    // etc....
}

const char *vertexShaderSource = "#version 330 core\n"
    "layout (location = 0) in vec3 aPos;\n"
    "void main()\n"
    "{\n"
    "   gl_Position = vec4(aPos.x, aPos.y, aPos.z, 1.0);\n"
    "}\0";

int main() {
    // some stuff
    unsigned int vertexShader = glCreateShader(GL_VERTEX_SHADER);
    if (!getCompileResult(&vertexShader, vertexShaderSource))
        return -1;
}


i feel like i'm maybe making mistakes with pointers, but i checked what src in getCompileResult points to and it's exactly the source i declared
the opengl documentation says:

>GL_INVALID_VALUE is generated if shader is not a value generated by OpenGL.
>GL_INVALID_VALUE is generated if maxLength is less than 0.

it can't be the second one because i passed nullptr to glShaderSource, which means it treats the string as null-terminated, but i don't know what "shader is not a value generated by OpenGL" means
any help?
>>
Maybe it doesn't like the nullptr for some reason, try swapping it for a length just to see if it works.

"shader is not a value generated by OpenGL" would mean that glCreateShader() failed, did you check if it returns a valid value?
>>
>>143933
still doesn't work; the documentation says nullptr CAN be a parameter if you feed it a null-terminated string and want to let it figure out the length

however i noticed something very strange: when i slowly step into the function while debugging, it successfully compiles, but it doesn't in "release" so to speak
makes no damn sense

the previous version of the program had one big init() function that created a window, prepared the VAO, VBO and EBO, and compiled the shaders, and it worked there; i can't figure out what changed
>>
>>143933
as for the return value of glShaderSource... it's a void function so there's no value to check; the errors are queued up and to get them i used glGetError()
>>
>>143943
Sounds weird, almost like the compiler is optimizing some of your code in a way that it's not supposed to.

>>143944
I mean the other function, the function that gives you unsigned int vertexShader.
>>
>>143948
oh god it does return 0, which means there was an error creating the shader object, but glGetError() doesn't report any errors
this is extremely puzzling
i will go to sleep now, maybe i can get to the bottom of this tomorrow with a well-rested mind
>>
Stop masturbating. Your brain will work more clearly and you will be able to compile teh triangle
>>
>>143961
Are you initializing the window first? There's all kinds of problems if you try to use OpenGL before the window is ready to go.
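
the usual startup order with GLFW + glad (a sketch under the assumption that OP is following the learnopengl.com stack; any GL call made before a context is current and the function loader has run is invalid, and glCreateShader will typically return 0 in that state):

```cpp
// Sketch of the init order (assumes GLFW + glad, as in the learnopengl.com tutorial)
#include <glad/glad.h>
#include <GLFW/glfw3.h>

int main() {
    glfwInit();
    GLFWwindow *window = glfwCreateWindow(800, 600, "triangle", nullptr, nullptr);
    glfwMakeContextCurrent(window);                      // 1. context current on this thread
    gladLoadGLLoader((GLADloadproc)glfwGetProcAddress);  // 2. GL function pointers loaded
    unsigned int vs = glCreateShader(GL_VERTEX_SHADER);  // 3. only now is this call valid
    // a return value of 0 here means the setup above failed
}
```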
>>
File: anger.mp4 (668 KB, 768x512)
know better
>>
Firstly, why the fuck are you passing in a pointer to the shader handle? Just pass the int directly. Not that it will change anything here, but come on now.

Also, don't use types like 'int' and 'unsigned int'. Use the stdint types instead.

Anyway, the debugging steps are pretty simple: start by inlining the function at the call site and see what happens, then see what happens with an empty shader, then play with the parameters, etc.

10 years ago I could have helped more directly, but unfortunately I haven't touched opengl in quite a while

Do you know how to use gdb?

Good luck
>>
Make sure your OpenGL context is initialized. And use the "-iv" functions to check the result of each step. Begin by doing it all inline in main and then break it up into functions once you have it working.
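
the "-iv" check referred to here is glGetShaderiv with GL_COMPILE_STATUS, plus glGetShaderInfoLog for the message; a sketch of a helper like OP's, assuming a current GL context:

```cpp
// Sketch: compile-status checking with the -iv functions (assumes a current GL context)
bool compileOk(unsigned int shader, const char *src) {
    glShaderSource(shader, 1, &src, nullptr);  // handle passed by value is enough
    glCompileShader(shader);
    int status = 0;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    if (!status) {
        char infoLog[512];
        glGetShaderInfoLog(shader, sizeof infoLog, nullptr, infoLog);
        // report infoLog however you like here
    }
    return status != 0;
}
```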
>>
i rewrote it from scratch and it works now
i'm unsure where the error was, as the function is the same and so is mostly everything else
the only change was passing the shader handle directly instead of passing a pointer to it, as >>144088 said (could that really be an annoyance to opengl?)
>>144088
does it matter which uint i use? i've always been under the impression that i only have to use stdint types when i need a variable to be a specific size; in this case the opengl functions i use to get the shaders return GLuint, which my IDE informs me is a typedef for a plain unsigned int
>>144106
up until now the -iv functions weren't filling the infoLog array anyway
>>
>>144130
The int you use doesn't matter, but using stdint ints can avoid a lot of pain while simultaneously reducing the amount of typing involved in monstrosities like "unsigned long long int" (which is just uint64_t on common platforms).
If you ever go cross-platform (say, supporting Windows alongside Linux, where plain 'long' differs in size), this is a potential source of bugs.
>>
>>144139
Type sizes aren't going to change in practice unless you port your program to some really obscure platform that almost certainly won't support OpenGL anyway.

If you're concerned about long type names, then instead of using the baka names from the C/C++ headers, which are usually longer rather than shorter, I think it's better to just define the short names that many newer languages use:

typedef signed char i8; // Alternatively s8
typedef signed short i16; // Alternatively s16
typedef signed int i32; // Alternatively s32
typedef signed long long i64; // Alternatively s64
typedef unsigned char u8;
typedef unsigned short u16;
typedef unsigned int u32;
typedef unsigned long long u64;
>>
>>144144
thanks, i'll keep it in mind for my other projects, this one is just for learning, it's pretty hard

