Nov 12, 2025

Nanosensors, Raytracing, and Fourth-year Capstone

#Physics
#Simulations

When you hear ray-tracing, your mind probably jumps to gaming: glossy reflections, cinematic lighting, RTX badges plastered all over graphics cards. But what's fascinating is that the same technique driving next-gen visuals in GTA 6 can also be used for scientific simulations.

I found that out in a pretty unexpected way during the planning of our fourth-year capstone project.

Capstone Overview

Our capstone project revolves around designing a nanosensor solution: a polymer that specifically binds to gibberellin (GA) hormones, which drive growth in plants. The polymer is functionalized with (6,5) chirality single-walled carbon nanotubes (SWCNTs). Together, the polymer and the SWCNTs emit a distinct near-infrared (NIR) fluorescence, and that fluorescence intensity can be correlated to plant stress (less GA means poorer growth), allowing farmers to measure plant health easily and quantitatively.

In order for farmers to capture that signal without needing an entire lab, we're building a portable field device equipped with 830 nm excitation LEDs, an InGaAs (indium gallium arsenide) photodiode behind optical bandpass filters, and onboard circuitry to capture the fluorescence emission of interest. This device, coupled with our reference lab measurements, will allow us to fit a statistical (and/or ML) model that accurately determines plant stress in the field.

However, before assembling anything physical, we needed to know if the LEDs would provide enough excitation power (irradiance) to give us a good signal-to-noise ratio (SNR).

That’s when we discovered Raysect, a physics-based ray-tracing library in Python that treats light not as pixels, but as photons.

Why Raysect?

Raysect is an open-source Python library built for scientific optical simulations. Rather than chasing photorealism or game-engine integration, it computes power, radiance, spectral radiance, and other physically meaningful quantities in SI units using Monte Carlo ray sampling.

It's surprisingly flexible. Raysect supports full spectral rendering, meaning you can simulate specific wavelengths (830 nm in our case) with custom spectral functions (a Gaussian for LEDs) and assign physics-based materials like diffuse plastic, glass, or custom emissive spectra. It also has first-class support for 3D CAD models, making it perfect for engineering applications.

For our setup, this was perfect. We could model our device housing, LED placement, and detector placement, allowing us to directly estimate the irradiance (in W/m²) incident on the plant stem, all before ordering any parts or 3D printing.
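To give a feel for the workflow, here's a minimal sketch of the basic Raysect pattern, not our actual capstone script: build a World, attach primitives with physical materials, then call observe() on an observer. The geometry and material values are placeholders.

```python
# Minimal Raysect pattern (illustrative only): world -> primitives with
# physical materials -> observer. With no emitters this renders dark;
# the later sketches add the LEDs.
from raysect.optical import World, translate, ConstantSF
from raysect.optical.material import Lambert
from raysect.optical.observer import PinholeCamera, RGBPipeline2D
from raysect.primitive import Sphere

world = World()

# A diffuse (Lambertian) sphere standing in for any physical object:
# 50% reflectivity at all wavelengths, placed 30 cm in front of the origin.
Sphere(radius=0.01, parent=world, transform=translate(0, 0, 0.3),
       material=Lambert(ConstantSF(0.5)))

# Observers are nodes in the same scene-graph; observe() runs the
# Monte Carlo sampling and fills whatever pipelines are attached
# (an RGB pipeline here, power/radiance pipelines for measurements).
camera = PinholeCamera(pixels=(256, 256), pipelines=[RGBPipeline2D()], parent=world)
camera.observe()
```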

Building the Scene

The simulated scene consisted of three main elements:

1. CAD Model

Exported from our CAD modelling software as a .obj file and imported straight into Raysect. This represented the portable housing, which included the LED mounts and detector slot.

Orthogonal view of our CAD model for the first prototype of the field deployable device with the LED sticking out.
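For context, a rough sketch of that import step might look like the following. The file name is a placeholder, the scaling factor assumes the mesh was exported in millimetres, and the matte material is also just an assumption for illustration.

```python
# Sketch of loading the housing CAD export into a Raysect world.
# "housing.obj", the mm-to-m scaling, and the material are placeholders.
from raysect.optical import World, translate, rotate, ConstantSF
from raysect.optical.material import Lambert
from raysect.primitive.mesh import import_obj

world = World()

# Import the housing mesh; scaling converts the assumed CAD units (mm) to metres.
housing = import_obj("housing.obj", scaling=0.001, parent=world,
                     transform=translate(0, 0, 0) * rotate(0, 0, 0))

# Treat the printed housing as a matte, ~30% reflective surface.
housing.material = Lambert(ConstantSF(0.3))
```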

2. LEDs

The LEDs were modelled as spherical emitters with a Gaussian spectral profile centered at 830 nm and a FWHM (full width at half maximum, i.e. the spread of the spectrum at half its peak) of 30 nm. Each LED emitted about 5 mW of optical power (consistent with the LEDs available for us to order).

Raysect's UniformSurfaceEmitter class let me define this distribution exactly, scaling it to match our target optical power (5 mW).
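Here's a sketch of how that LED model can be put together: a Gaussian spectral function centered at 830 nm with a 30 nm FWHM, attached to a small emitting sphere. The emitter radius and the radiometric bookkeeping used to back out a radiance for ~5 mW are my own assumptions, not values from our final script.

```python
# Sketch of one LED: Gaussian spectrum (830 nm centre, 30 nm FWHM) on a
# small spherical surface emitter, scaled to roughly 5 mW of total power.
# The 2.5 mm radius and the power-to-radiance conversion are assumptions.
import numpy as np

from raysect.optical import World, translate, InterpolatedSF
from raysect.optical.material.emitter import UniformSurfaceEmitter
from raysect.primitive import Sphere

world = World()

# Gaussian spectral shape: centre 830 nm, FWHM 30 nm.
wavelengths = np.linspace(760.0, 900.0, 200)
sigma = 30.0 / (2.0 * np.sqrt(2.0 * np.log(2.0)))           # FWHM -> sigma (~12.7 nm)
shape = np.exp(-0.5 * ((wavelengths - 830.0) / sigma) ** 2)
shape /= shape.sum() * (wavelengths[1] - wavelengths[0])    # integral over nm ~= 1
led_spectrum = InterpolatedSF(wavelengths, shape)

# Pick a uniform radiance so the whole sphere emits ~5 mW:
# P = exitance * area = (pi * L) * (4 * pi * r^2)  =>  L = P / (4 * pi^2 * r^2)
power = 5e-3      # W per LED
radius = 2.5e-3   # m, assumed emitter radius
radiance = power / (4.0 * np.pi ** 2 * radius ** 2)         # ~20 W/m^2/sr

led = Sphere(radius=radius, parent=world, transform=translate(0.0, 0.0, 0.05),
             material=UniformSurfaceEmitter(led_spectrum, scale=radiance))
```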

3. Detectors

We added two types of detectors:

  1. Power Detector: records irradiance maps in W/m².
  2. RGB Camera: used purely for visual sanity checks (i.e. checking LED placement, color of LEDs, etc.).

The LEDs were placed along a semicircle inside the housing to mirror the physical layout of our prototype; a rough code sketch of this arrangement follows the figure below.

Inside view of the portable field device with the yellow backplate removed.
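A sketch of both observers plus the semicircular LED ring is below. The Pixel observer with a PowerPipeline0D records the total power landing on a small patch (irradiance is then power divided by patch area), while the PinholeCamera is just the visual check. Every position, width, and sample count here is a placeholder rather than our final geometry.

```python
# Sketch of the two detectors plus a semicircular LED ring. All geometry,
# widths, and sample counts are placeholders, not our prototype's layout.
import numpy as np

from raysect.optical import World, translate, InterpolatedSF
from raysect.optical.material.emitter import UniformSurfaceEmitter
from raysect.optical.observer import Pixel, PowerPipeline0D, PinholeCamera, RGBPipeline2D
from raysect.primitive import Sphere

world = World()

# 830 nm Gaussian LED spectrum (same construction as the LED sketch above).
wavelengths = np.linspace(760.0, 900.0, 200)
sigma = 30.0 / (2.0 * np.sqrt(2.0 * np.log(2.0)))
shape = np.exp(-0.5 * ((wavelengths - 830.0) / sigma) ** 2)
shape /= shape.sum() * (wavelengths[1] - wavelengths[0])
led_spectrum = InterpolatedSF(wavelengths, shape)

# LEDs on a semicircle of radius 30 mm, 20 mm above the detector plane.
for angle in np.linspace(0.0, np.pi, 5):
    x, y = 0.03 * np.cos(angle), 0.03 * np.sin(angle)
    Sphere(radius=2.5e-3, parent=world, transform=translate(x, y, 0.02),
           material=UniformSurfaceEmitter(led_spectrum, scale=20.0))  # ~5 mW each

# 1. Power detector: a 1 cm x 1 cm patch at the sample position (normal assumed
#    along +z, toward the LEDs). Widen the observed band so the 830 nm emission
#    registers -- the default observing range only covers visible wavelengths.
power = PowerPipeline0D()
patch = Pixel(pipelines=[power], x_width=0.01, y_width=0.01, parent=world)
patch.max_wavelength = 900
patch.min_wavelength = 760
patch.pixel_samples = 100000
patch.observe()

irradiance = power.value.mean / (0.01 * 0.01)   # W -> W/m^2 over the patch area
print(f"Average irradiance on detector: {irradiance:.2f} W/m^2")

# 2. RGB camera: purely a sanity check on LED placement and framing.
camera = PinholeCamera(pixels=(256, 256), pipelines=[RGBPipeline2D()],
                       parent=world, transform=translate(0.0, 0.0, -0.2))
camera.observe()
```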

Simulating Irradiance

With the scene assembled, we ran the simulation, which traces millions of light rays sampled from the spectral functions defined for the LEDs and follows how they propagate through the scene.

Using these light rays, some fancy internal math that Raysect does, and a few minutes of compute time, the results came out:

Average Irradiance on Detector: 3.23 W/m²

That was exactly in line with our back-of-the-envelope calculations: about 1 W/m² per LED. The irradiance map also showed the expected spatial gradient: brightest near the center, tapering off toward the edges.
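For anyone curious, the back-of-the-envelope version is just power divided by illuminated area. The ~50 cm² footprint below is an assumed number chosen to show the order of magnitude, not a measurement from our housing.

```python
# Order-of-magnitude check with illustrative numbers: 5 mW spread over an
# assumed ~50 cm^2 footprint gives roughly 1 W/m^2 per LED, so a handful
# of LEDs adding up to a few W/m^2 is in the right ballpark.
led_power = 5e-3           # W, optical power per LED
illuminated_area = 50e-4   # m^2 (~50 cm^2, an assumed footprint)

irradiance_per_led = led_power / illuminated_area
print(f"~{irradiance_per_led:.1f} W/m^2 per LED")   # -> ~1.0 W/m^2
```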

Left side: plot of the irradiance map (rotated 90 degrees by default). Right side: RGB camera view of the inside of the housing.

Interestingly, in the RGB camera view, the LEDs appeared as a deep red colour. Since 830 nm sits beyond the visible spectrum (370 - 740 nm), Raysect maps it to the closest visible hue. That's also why, in reality, NIR LEDs look faintly red even though they primarily emit infrared light: a small tail of their emission creeps into the visible range.

Rendering the Device

For fun (and validation), I added a few high-intensity light sources at a distance to illuminate the scene and rendered the entire device using Raysect's PinholeCamera. This didn't affect the physics of the NIR simulation; it just made for a satisfying visual of our device.

Ambient light ray-tracing render of our prototype for fun.
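The render setup itself is simple; a sketch with made-up light positions, brightness, and camera placement might look like this (the housing mesh from the CAD sketch would be added to the same world before rendering).

```python
# Sketch of the "for fun" render: a few distant bright spheres act as
# ambient lights and a PinholeCamera images the whole device. Positions,
# brightness, and camera placement are made up.
from raysect.optical import World, translate, rotate, d65_white
from raysect.optical.material.emitter import UniformSurfaceEmitter
from raysect.optical.observer import PinholeCamera, RGBPipeline2D
from raysect.primitive import Sphere

world = World()
# ... add the housing mesh and LED geometry to `world` here ...

# High-intensity white emitters placed far from the device to light the scene.
for position in [(1.0, 1.0, -1.0), (-1.0, 1.0, -1.0), (0.0, 1.5, 1.0)]:
    Sphere(radius=0.1, parent=world, transform=translate(*position),
           material=UniformSurfaceEmitter(d65_white, scale=50.0))

# Camera pulled back so the whole housing fits in frame.
camera = PinholeCamera(pixels=(512, 512), fov=40, pipelines=[RGBPipeline2D()],
                       parent=world, transform=translate(0.0, 0.1, -0.5) * rotate(0, -10, 0))
camera.pixel_samples = 100
camera.observe()
```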

Why This Matters

Using Raysect for this project completely changed how I think about ray tracing. Instead of generating photorealistic images or making games, I was generating data: physically accurate estimates of irradiance and power density inside our prototype.

This will eventually feed into our LTspice (electronic circuit simulator) models, where we'll simulate the photodiode's response and calculate the system's signal-to-noise ratio (SNR). It bridges optics and electronics in a neat way: tracing photons in Python to predict voltages in SPICE.
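That handoff ultimately boils down to responsivity times incident optical power. Here's a tiny sketch with placeholder numbers; the responsivity and the fluorescence power reaching the photodiode are assumptions, not measured or simulated values.

```python
# Sketch of the optics-to-electronics bridge: whatever optical power the
# ray-tracing predicts at the photodiode (eventually the NIR fluorescence,
# far weaker than the excitation) becomes a photocurrent via the detector
# responsivity, and that current drives a source in the LTspice model.
responsivity = 0.55      # A/W, assumed InGaAs responsivity in the NIR band
incident_power = 10e-9   # W, hypothetical fluorescence power on the diode

photocurrent = responsivity * incident_power   # A, input to the SPICE model
print(f"Photocurrent: {photocurrent * 1e9:.1f} nA")   # -> 5.5 nA for these numbers
```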

Closing Thoughts

It's pretty wild that a technique originally built for rendering movie scenes is now helping us model real-world physics.

For our project, Raysect gave us confidence that our NIR excitation setup would actually deliver enough power to elicit a measurable fluorescence signal, long before the first prototype or nanosensor was built.

It's one of those rare tools that sits perfectly between engineering and art, the kind of stuff that I deeply enjoy.