
Open-Source Ray-Tracing library


OpenRT is a C++ ray-tracing library for the synthesis of photo-realistic images. The library is developed primarily for academic purposes: as an accompaniment to a course on computer graphics and as a teaching aid for university students. Specifically, it includes the following features:

  • Distribution Ray Tracing
  • Global Illumination

OpenRT aims for a realistic simulation of light transport, as compared to other rendering methods, such as rasterisation, which focus more on the realistic simulation of geometry. Effects such as reflections and shadows, which are difficult to simulate using other algorithms, are a natural result of the ray-tracing algorithm. The computational independence of each ray makes our ray-tracing library amenable to a basic level of parallelisation. OpenRT is released under a BSD license and hence is free for both academic and commercial use. The code is written entirely in C++ using the OpenCV library.


Low Overhead

OpenRT has only one external dependency: OpenCV. It is optimised for highly efficient calculations and takes advantage of multi-core processing as well as GPU computing.

Batteries Included

Comes out of the box with everything you need to create your first ray tracing application. A selection of demo projects may serve as the basis for your own application.


OpenRT is a cross-platform, dynamic-link library, meant to be used on Windows, macOS and Linux. Its C++17 code compiles with Microsoft Visual Studio, Xcode and gcc.




Constructive solid geometry (by Otmane Sabir)


Area Lights


Ambient Occlusion


VR 360° Camera (by Fjolla Dedaj)


Cornell Box


Computer Graphics Course Schedule

Date Lecture Slides Worksheets Assignments
02.09.2021 Introduction pdf
09.09.2021 Introduction to Ray Tracing slides
16.09.2021 Camera and Lens Models slides Worksheet 1
23.09.2021 Ray-geometry intersection algorithms slides Worksheet 2 Assignment 1
14.10.2021 Spatial Index Structures slides Assignment 2
21.10.2021 Shading: Rendering Equation & BRDF slides Worksheet 3 Assignment 3
28.10.2021 Texturing slides Worksheet 4 Assignment 4
04.11.2021 Distribution Ray-Tracing slides Assignment 5
11.11.2021 Transformations pdf
11.11.2021 Animation pdf Worksheet 5 Assignment 6
18.11.2021 Global Illumination pdf
25.11.2021 Human Visual System pdf
25.11.2021 Color pdf
02.12.2021 Sampling and Reconstruction pdf
02.12.2021 Environment camera & Virtual Reality pdf Worksheet 6
16.12.2021 Final Exam
27.01.2021 Make-Up Exam

Project and Thesis Topics

Procedural Textures

Unlike a bitmap texture, which stores the texture as an image, a procedural texture describes the texture mathematically. Although not widely used, this method is resolution-independent and can create more precise textures, especially when there is great and varying depth to the objects being textured. Procedural textures may be 2D or 3D.

Many procedural textures for materials such as marble, stone and wood are generated based on Perlin noise.

Read more:
Wikipedia Victor’s blog YouTube
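As a minimal sketch of the idea, a marble-like texture can be built from sine stripes perturbed by turbulence (a sum of noise octaves). The names below (hash3, valueNoise, turbulence, marble) are illustrative and not part of the OpenRT API; classic Perlin gradient noise is replaced here by simpler value noise:

```cpp
#include <cmath>
#include <cstdint>

// Hash a 3D lattice point to a pseudo-random value in [0, 1].
static float hash3(int x, int y, int z) {
    uint32_t h = uint32_t(x) * 73856093u ^ uint32_t(y) * 19349663u ^ uint32_t(z) * 83492791u;
    h ^= h >> 13;
    h *= 1274126177u;
    h ^= h >> 16;
    return float(h & 0x00FFFFFFu) / float(0x00FFFFFFu);
}

static float smoothstep01(float t) { return t * t * (3.0f - 2.0f * t); }
static float lerp(float a, float b, float t) { return a + t * (b - a); }

// Trilinearly interpolated lattice ("value") noise in [0, 1].
float valueNoise(float x, float y, float z) {
    int xi = int(std::floor(x)), yi = int(std::floor(y)), zi = int(std::floor(z));
    float tx = smoothstep01(x - xi), ty = smoothstep01(y - yi), tz = smoothstep01(z - zi);
    float c000 = hash3(xi, yi, zi),     c100 = hash3(xi+1, yi, zi);
    float c010 = hash3(xi, yi+1, zi),   c110 = hash3(xi+1, yi+1, zi);
    float c001 = hash3(xi, yi, zi+1),   c101 = hash3(xi+1, yi, zi+1);
    float c011 = hash3(xi, yi+1, zi+1), c111 = hash3(xi+1, yi+1, zi+1);
    float x00 = lerp(c000, c100, tx), x10 = lerp(c010, c110, tx);
    float x01 = lerp(c001, c101, tx), x11 = lerp(c011, c111, tx);
    return lerp(lerp(x00, x10, ty), lerp(x01, x11, ty), tz);
}

// Sum of noise octaves with halving amplitude and doubling frequency.
float turbulence(float x, float y, float z, int octaves) {
    float sum = 0.0f, amp = 0.5f, freq = 1.0f;
    for (int i = 0; i < octaves; ++i) {
        sum += amp * valueNoise(x * freq, y * freq, z * freq);
        amp *= 0.5f;
        freq *= 2.0f;
    }
    return sum;
}

// Marble: sine stripes along x, perturbed by turbulence; result in [0, 1].
float marble(float x, float y, float z) {
    return 0.5f * (1.0f + std::sin(4.0f * x + 8.0f * turbulence(x, y, z, 4)));
}
```

Because the texture is evaluated at an arbitrary 3D point, solid objects can be "carved" out of the material with no UV mapping and no resolution limit.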

Consistent Normal Interpolation

Rendering a polygonal surface with Phong normal interpolation allows shading to appear as it would for a true curved surface while maintaining the efficiency and simplicity of coarse polygonal geometry. However, this approximation fails in certain situations, especially for grazing viewing directions. Well-known problems include physically impossible reflections and implausible illumination.

Read more:
researchGate Code & Visuals
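For reference, the standard Phong interpolation that this topic improves upon is sketched below. The `Vec3` struct is a minimal stand-in, not the OpenRT vector type:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Barycentric blend of the three vertex normals at hit point (u, v):
// n = normalize((1 - u - v) * n0 + u * n1 + v * n2).
// At grazing angles this shading normal can tilt away from the viewer,
// which is exactly the artefact consistent normal interpolation addresses.
Vec3 phongNormal(const Vec3& n0, const Vec3& n1, const Vec3& n2, float u, float v) {
    float w = 1.0f - u - v;
    Vec3 n = { w * n0.x + u * n1.x + v * n2.x,
               w * n0.y + u * n1.y + v * n2.y,
               w * n0.z + u * n1.z + v * n2.z };
    return normalize(n);
}
```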

Subsurface scattering / Subsurface light transport (SSLT)

This topic concentrates on the creation of new shaders for materials such as wax, skin and marble. SSLT may be implemented based on the photon beam diffusion (PBD) technique by Habel et al. The resulting profile takes all orders of scattering into account, effectively accounting for all of the light transport that occurs within the surface.

Read more:
wikipedia, pbr book

Thick Lens Model

This topic concentrates on the creation of new cameras using the thick lens model, a fairly rough approximation of actual camera lens systems, which comprise a series of multiple lens elements, each of which modifies the distribution of radiance passing through it.

Read more:
pbr book

Bump Mapping and Stochastic Deviation of Normals

Stochastic Supersampling

Depth of Field (DoF)

This topic concentrates on the creation of new cameras using the thin lens approximation, to model the effect of finite apertures within traditional computer graphics projection models. The thin lens approximation models an optical system as a single lens with spherical profiles, whose thickness is small relative to its radius of curvature.

The DoF effect is part of stochastic ray tracing and makes use of random sample generators.

Read more:
wikipedia, pbr book
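A minimal sketch of thin-lens ray generation follows. The camera is assumed to sit at the origin looking down +z; the names and conventions are illustrative, not the OpenRT camera API:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
struct Ray  { Vec3 org, dir; };

static Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// pinholeDir: direction of the ideal pinhole ray through the pixel.
// u1, u2: uniform random numbers in [0, 1) used to sample the lens disk.
Ray thinLensRay(Vec3 pinholeDir, float lensRadius, float focalDistance, float u1, float u2) {
    pinholeDir = normalize(pinholeDir);
    // 1. Find where the pinhole ray pierces the plane of focus (z = focalDistance).
    float t = focalDistance / pinholeDir.z;
    Vec3 pFocus = { pinholeDir.x * t, pinholeDir.y * t, pinholeDir.z * t };
    // 2. Sample a point on the lens disk (polar mapping of the two randoms).
    float r = lensRadius * std::sqrt(u1);
    float theta = 6.2831853f * u2;
    Vec3 org = { r * std::cos(theta), r * std::sin(theta), 0.0f };
    // 3. All lens rays are aimed at pFocus, so points on the focal plane stay
    //    sharp while everything in front of or behind it is blurred.
    Vec3 dir = normalize({ pFocus.x - org.x, pFocus.y - org.y, pFocus.z - org.z });
    return { org, dir };
}
```

Averaging many such rays per pixel, each with fresh (u1, u2) samples, produces the depth-of-field blur; with lensRadius = 0 the function degenerates to the ordinary pinhole ray.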

Geodesic raytracing

This topic assumes non-linear light propagation, so the ray equation r(t) = o + td is no longer valid. Instead, the light path is described by Schwarzschild geodesics. This happens, for example, when light travels close to a black hole and is deflected (bent) by gravity.

Read more:
How to draw a black hole arXiv
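As a sketch of the underlying maths, the photon orbit in the Schwarzschild metric can be written in Binet form, u''(φ) = -u + (3/2) r_s u², where u = 1/r and r_s is the Schwarzschild radius. Integrating this ODE numerically (here with a hand-rolled RK4 step; the function name is illustrative) recovers the weak-field deflection angle 2 r_s / b for impact parameter b:

```cpp
#include <cmath>

// Integrate the photon orbit equation u'' = -u + (3/2) r_s u^2 from infinity
// (u = 0, u' = 1/b) until the ray escapes again, and return the deflection
// angle: the total swept angle minus pi. For r_s = 0 the path is straight.
double schwarzschildDeflection(double rs, double b) {
    const double pi = 3.14159265358979323846;
    double u = 0.0, w = 1.0 / b, phi = 0.0;
    const double h = 1e-4;                         // integration step in phi
    auto acc = [rs](double u) { return -u + 1.5 * rs * u * u; };
    while (true) {
        // Classic RK4 step for the system (u' = w, w' = acc(u)).
        double k1u = w,                 k1w = acc(u);
        double k2u = w + 0.5 * h * k1w, k2w = acc(u + 0.5 * h * k1u);
        double k3u = w + 0.5 * h * k2w, k3w = acc(u + 0.5 * h * k2u);
        double k4u = w + h * k3w,       k4w = acc(u + h * k3u);
        double uNext = u + h / 6.0 * (k1u + 2 * k2u + 2 * k3u + k4u);
        double wNext = w + h / 6.0 * (k1w + 2 * k2w + 2 * k3w + k4w);
        if (phi > 0.1 && uNext < 0.0) {
            // Linear interpolation to the zero crossing of u (ray at infinity).
            double t = u / (u - uNext);
            return phi + t * h - pi;
        }
        u = uNext; w = wNext; phi += h;
    }
}
```

A geodesic ray tracer marches camera rays with such an integrator instead of intersecting straight lines, which produces the characteristic lensing rings around the black hole.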

Particle System

This topic concentrates on the creation of a new type of geometry to simulate certain kinds of “fuzzy” phenomena that are otherwise very hard to reproduce with conventional rendering techniques – usually highly chaotic systems, natural phenomena, or processes caused by chemical reactions.

A particle system’s position and motion are controlled by an emitter. The emitter acts as the source of the particles, and its location in 3D space determines where they are generated and where they move to. A regular primitive or a solid, such as a cube or a plane, can be used as an emitter. The particles’ physics and mutual interactions should also be modelled.
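A minimal emitter/update loop might look as follows. The class and field names are illustrative, not OpenRT types; the emitter here is a single point spawning particles under gravity:

```cpp
#include <vector>
#include <random>
#include <algorithm>

struct Particle {
    float pos[3];
    float vel[3];
    float life;     // remaining lifetime in seconds
};

class Emitter {
public:
    explicit Emitter(unsigned seed) : m_rng(seed), m_dist(-1.0f, 1.0f) {}

    // Spawn n particles at the emitter origin with randomised velocities.
    void emit(int n) {
        for (int i = 0; i < n; ++i) {
            Particle p;
            p.pos[0] = p.pos[1] = p.pos[2] = 0.0f;
            p.vel[0] = m_dist(m_rng);
            p.vel[1] = 2.0f + m_dist(m_rng);   // biased upwards
            p.vel[2] = m_dist(m_rng);
            p.life   = 1.0f;
            m_particles.push_back(p);
        }
    }

    // Advance the simulation: apply gravity, integrate positions, cull the dead.
    void update(float dt) {
        for (auto& p : m_particles) {
            p.vel[1] -= 9.81f * dt;
            for (int i = 0; i < 3; ++i) p.pos[i] += p.vel[i] * dt;
            p.life -= dt;
        }
        m_particles.erase(
            std::remove_if(m_particles.begin(), m_particles.end(),
                           [](const Particle& p) { return p.life <= 0.0f; }),
            m_particles.end());
    }

    size_t count() const { return m_particles.size(); }

private:
    std::vector<Particle> m_particles;
    std::mt19937 m_rng;
    std::uniform_real_distribution<float> m_dist;
};
```

For rendering, each live particle would be instanced as a small piece of geometry (a billboard, sphere or metaball) that the ray tracer can intersect.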


Distributed Rendering

Feature content

GPU-based rendering

Feature content

Bounding Volume Hierarchy (BVH)