Dissertation
My Virtual Horse: An AR Experience
The Brief
A Welsh horse club wanted to explore whether XR technology could help people experience their horses remotely. The long-term vision was a full interactive experience; the realistic first step was a proof of concept — an AR environment where a user wearing a Meta Quest headset could see a realistic horse model in their physical space.
What I Built
A realistic 3D horse model of a specific Welsh Pony and Cob, modelled in Blender using quad-based topology. I visited the client’s stables to photograph the actual horse from multiple angles, then used those reference photos to trace, build, and texture the model. The topology was designed with animation in mind — denser mesh around the face and ears where horses are most expressive, cleaner flow along the joints for natural deformation.
A textured, rigged, animated model. UV unwrapped, texture-painted using the reference photos as stencil overlays, rigged using Rigify (after comparing it against a hand-built skeleton), and animated with a looping idle animation — ear twitches, head movement, the small things that make a static model feel alive.
An AR environment in Unity configured for Meta Quest passthrough. This meant stripping out Unity’s default post-processing (volumetric lighting, HDR), setting up XR-specific camera tracking, plane detection, and anchor management, and solving the various import conflicts between Blender’s export format and Unity’s expectations.
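The passthrough configuration described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: it assumes an OVRCameraRig is already in the scene, and the component and enum names (OVRPassthroughLayer, OVROverlay.OverlayType) follow the Meta XR SDK as I understand it and may differ between SDK versions.

```csharp
using UnityEngine;

// Hedged sketch: enabling Quest passthrough in a Unity scene.
// Assumes the Meta XR SDK is installed and an OVRCameraRig exists.
public class PassthroughSetup : MonoBehaviour
{
    void Start()
    {
        // Passthrough renders behind scene content, so the camera must
        // clear to transparent black rather than draw a skybox.
        var cam = Camera.main;
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = Color.clear;

        // Add the passthrough layer supplied by the Meta XR SDK and
        // render it as an underlay beneath the virtual horse.
        var layer = gameObject.AddComponent<OVRPassthroughLayer>();
        layer.overlayType = OVROverlay.OverlayType.Underlay;
    }
}
```

Plane detection and anchoring sit on top of this: detected surfaces give the horse a floor to stand on, and anchors keep it fixed in the room as the user moves.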
What Didn’t Work
Hair. Blender’s hair particle system doesn’t export cleanly to Unity. The strand-based approach that works for pre-rendered 3D simply isn’t practical for real-time AR on mobile hardware.
Fur. Shell-method fur shaders designed for Unity’s Universal Render Pipeline partially worked — but only rendered for one eye. AR requires stereo rendering, and the shader wasn’t written to handle that. The horse appeared fully furred in the left eye and invisible in the right. This is a genuine unsolved edge case in AR development — most fur shaders assume mono rendering.
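For context, the one-eye symptom is characteristic of a shader missing Unity's single-pass instanced stereo plumbing: under that rendering mode both eyes are drawn in one pass, and a shader that never reads the per-eye instance ID only produces output for one of them. A hedged sketch of the vertex-stage macros involved (fur logic omitted; structure simplified, and macro behaviour as I understand Unity's URP shader library):

```hlsl
// Sketch of the stereo-instancing plumbing a URP shader needs in order to
// render in both eyes under single-pass instanced rendering.
struct Attributes
{
    float4 positionOS : POSITION;
    UNITY_VERTEX_INPUT_INSTANCE_ID     // carries the per-eye instance ID
};

struct Varyings
{
    float4 positionCS : SV_POSITION;
    UNITY_VERTEX_OUTPUT_STEREO         // carries the eye index onward
};

Varyings vert(Attributes input)
{
    Varyings output;
    UNITY_SETUP_INSTANCE_ID(input);                // select this eye's matrices
    UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(output); // forward the eye index
    output.positionCS = TransformObjectToHClip(input.positionOS.xyz);
    return output;
}
```

A fur shader that omits these macros can still look correct in a flat editor preview, which is why the failure only surfaced once the project ran in stereo on the headset.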
Device deployment. The project runs correctly when previewed from Unity's Play Mode with the Quest connected in developer mode. However, building it to the headset as a standalone app failed: Unity wouldn't recognise the Quest as a build target. The AR experience works, but only tethered to a development environment.
What I Learned
Modelling organic forms is a different discipline from modelling hard-surface objects. A cube is a handful of simple shapes; a horse is continuous curves and flowing topology, and every edge decision affects how the model deforms when animated. I iterated the model several times, and each version taught me something the previous one couldn't.
Preparation matters more than skill. The reference photos from the stable visit were the single most valuable asset in the project. Without them, I’d have been guessing at proportions and textures. With them, I could paint directly from reality.
Knowing what doesn’t work is as valuable as knowing what does. The hair and fur failures aren’t just failures — they’re documentation of where the current toolchain breaks down for AR. That’s useful information for anyone attempting similar work.
Client work requires managing expectations. I’m naturally ambitious, and I had to learn to say “that’s outside the scope” without it feeling like giving up. The client wanted a full interactive experience. What I could deliver was a strong foundation and an honest assessment of what’s possible now versus what needs more development.
Technologies
Blender (modelling, texturing, rigging, animation), Unity (AR environment, XR integration), C# (Unity scripting), Meta Quest (hardware target), Meta XR SDK (passthrough, tracking)