Creating a Triangle in OpenGL In-Shader


As a prerequisite to this tutorial, I suggest you read the previous one which can be found here, because it contains a class which will read and compile shaders. Last time we started to delve into shaders and how to set them up in the main program. This time we will create a vertex and fragment shader which will result in a green triangle.

Let’s start with the vertex shader, whose code is the following:

#version 430 core

void main(void)
{
   const vec4 vertices[3] = vec4[3](vec4( 0.25, -0.25, 0.5, 1.0),
                                    vec4(-0.25, -0.25, 0.5, 1.0),
                                    vec4( 0.25, 0.25, 0.5, 1.0));
   gl_Position = vertices[gl_VertexID];
}

To do this right, first create a folder in your project called Shaders and create a file called Vertex_Shader.glsl. Open it and write the vertex shader code from above.

In any shader, you have to state the GLSL version you will be using (in my case, 4.30, which ships with OpenGL 4.3). The “core” keyword means that we are restricting ourselves to the core profile of the GLSL language, without deprecated or compatibility functionality. Because we are not touching buffers yet, we move straight on to the main function.

In order to create a triangle, we of course need to specify its vertex positions, which are passed on to gl_Position. It should be noted that the positions stored in gl_Position represent not the vertex positions in the virtual world, but positions in our window.

[image: the normalized coordinate range of the window, from -1 to 1 on each axis]

In this case, 0.0 represents the center of the window and the values range from -1 to 1, as in the image above. So, for example, if you have a 200×400 (width × height) window and you wanted to draw the triangle from above, the vertices would land on the pixels at coordinates (125, 150), (75, 150), (125, 250). The -1 to 1 values are called normalized device coordinates (NDC).

The normalized device coordinates are obtained by dividing each component (x, y, z) by the 4th component, referred to as W. In our case W is 1.0 because we are dealing with a vertex position. If we were transforming a direction vector (such as a normal), then W = 0 and the divide-by-W normalization is skipped. OpenGL defaults to a normalized cube with dimensions X = -1..+1, Y = -1..+1, Z = -1..+1. DirectX uses a slightly different convention: Z = 0..+1. Which way is “correct”?

It doesn’t matter which abstraction we pick. Using the more consistent -1..+1 range is not really any more convenient, as the math works out regardless of which coordinate system you pick. The purpose of using NDC in the first place is that it gives us device-independent coordinates: if one device has 1920 pixels across and a second device has 1024 pixels across, the normalized coordinate <1, 1, 1> always refers to the top-right pixel on both.

After the NDC transformation is performed, there is a final transformation, called the window transformation or screen transformation, which fits your scene into OpenGL’s viewport. These two transformations are done by your graphics card, so you don’t have to worry about anything. After this, the final coordinates are passed to the rasterization stage, where all shapes are converted to pixels/fragments.

In our case, the triangle will look like this:

[image: the rendered green triangle]

As for the z coordinate, its only purpose here is to determine which fragment is in front of another. In window coordinates, the z coordinate is mapped to the range 0.0 to 1.0. gl_VertexID holds the id of the vertex currently being processed. In this case, because we draw exactly 3 vertices and all of them are in the vertices array, indexing the array with gl_VertexID gives each shader invocation its own position.

Moving on to the fragment shader, the code is as simple as can be:

#version 430 core
out vec4 color;

void main(void)
{
  color = vec4(0.0, 1.0, 0.0, 1.0);
}

In this case, we need to specify an out variable, namely the fragment color (here, green). Of course, you need to create another file for it in the same Shaders folder (already created) and name it Fragment_Shader.glsl.

Finally, we reach the main file, whose code is the following:

#include <GL/glew.h>      // GLEW must be included before other GL headers
#include <GL/freeglut.h>  // freeGLUT header (plain GLUT's glut.h also works)
#include <iostream>
#include <stdio.h>
#include <stdlib.h>
#include <fstream>
#include <vector>

#include "Core\Shader_Loader.h"

using namespace Core;

GLuint program;

void renderScene(void)
{
    glClearColor(1.0, 0.0, 0.0, 1.0); // set the clear color (red)...
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // ...then clear with it

    // use the created program
    glUseProgram(program);

    // draw 3 vertices as triangles
    glDrawArrays(GL_TRIANGLES, 0, 3);

    glutSwapBuffers();
}

void Init()
{

   glEnable(GL_DEPTH_TEST);

   //load and compile shaders
   Core::Shader_Loader shaderLoader;
   program = shaderLoader.CreateProgram("Shaders\\Vertex_Shader.glsl",
                                         "Shaders\\Fragment_Shader.glsl");
   glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowPosition(100, 100);
    glutInitWindowSize(800, 600);
    glutCreateWindow("Drawing my first triangle");
    glewInit();

    Init();

    // register callbacks
    glutDisplayFunc(renderScene);
    glutMainLoop();

    // note: classic GLUT never returns from glutMainLoop; this cleanup
    // only runs with freeGLUT, after the window is closed
    glDeleteProgram(program);
    return 0;
}

The first important addition to the initialization process is calling the CreateProgram method from the Shader_Loader class, which creates our compiled program. Then, during the render process, we select the program to be used with glUseProgram. Finally, we use glDrawArrays to instruct OpenGL how many vertices to draw and how (in this case, as triangles).

Before we end the tutorial, let’s talk a bit about glUseProgram and glDrawArrays.

  • glUseProgram – the program in question is a container for the shaders we are going to use when drawing the scene. You can switch programs between objects drawn, if you want to use different shaders for different objects.
  • glDrawArrays – the first argument is the drawing mode, i.e. the primitive (points, lines, triangles, triangle strip, etc.); the second is the index of the first vertex to process; and the third is the number of vertices to draw. Note that there is no array argument here: our vertices live inside the shader, so the call just tells OpenGL how many vertex ids to generate.

In this tutorial we passed vertices directly in NDC space, which we normally wouldn’t do. We also didn’t talk about how vertices are transformed from object space to NDC space using the projection and modelview matrices. We will do this in a later article, but first let’s see how we can pass vertices from OpenGL to the shader in the next tutorial.

The project folder structure should look like this:

[image: folders structure]

This is the simplest drawing you can do, apart from a single point or line. So until next time, I leave you with the following image.

[image: homework triangles]

Try to recreate it. Have fun!

Source code so far: in2GPU_TriangleShader

In case you don’t see any triangle: these shaders might not work on AMD or Intel video cards. This is a dummy shader, just to understand how things work; in real applications no one would create a triangle this way. There are other techniques for this, which will be covered in the next tutorials.

