Technician Exhibition - AI Demonstration
The University of Lincoln is proud to be part of the Technician Commitment, which aims to recognise, promote and support its technical specialists. To this end, an exhibition of the technical teams' work, projects and skills has been organised for September 2025. As a member of the technical team I agreed to contribute a demonstration, and have created an interactive piece focusing on AI, game design and immersive storytelling, and more generally promoting the work done under the Digital Aquarium banner.
To showcase some of the work being explored within the Digital Aquarium ethos, I have created an interactive experience that brings together multiple technologies and practices. It is an evolution of the Virtual Gallery concept created a few months prior, now including a 3D MetaHuman representation of myself that can mirror my facial expressions via ARKit and an iPad connection, alongside an AI-driven MetaHuman that can respond to voice or text prompts, answer questions, and explore or expand on elements within the virtual environment.
In this blog I am going to briefly go over some of the steps taken to create this experience; a fuller, more in-depth tutorial video will be made available in the near future.
Character Creator 4 Digital Double via Headshot plugin
Digital Double - CC to UE
To create my digital double I uploaded a photograph of my face into Microsoft Copilot to generate a more stylised 3D representation of my facial features. This image was then taken into Character Creator 4 and, using the Headshot plugin, converted into an animated, rigged 3D head.
As it stood, this rigged model would not be usable within Unreal 5.6 without additional steps, so the head was exported as an .fbx file into Unreal, later to be turned into a MetaHuman DNA Identity that translates and generates a MetaHuman-compatible rig ready for ARKit Live Link control.
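For anyone repeating the FBX step through the Editor's Python console rather than the import dialog, a minimal sketch is below. The content paths and the `dest_package_path` helper are illustrative assumptions, not the exact project layout used in the exhibition build.

```python
import os


def dest_package_path(fbx_path, root="/Game/DigitalDouble"):
    """Derive an Unreal content path from an FBX filename (illustrative helper)."""
    name = os.path.splitext(os.path.basename(fbx_path))[0]
    return f"{root}/{name}"


def import_head_fbx(fbx_path):
    """Run inside the Unreal Editor's Python console; imports the CC head mesh."""
    import unreal  # only available inside the Editor process

    task = unreal.AssetImportTask()
    task.filename = fbx_path
    task.destination_path = dest_package_path(fbx_path)
    task.automated = True   # suppress the interactive import dialog
    task.save = True        # save the imported asset immediately
    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```

Calling `import_head_fbx("C:/Export/HeadModel.fbx")` in the Editor would land the mesh under `/Game/DigitalDouble/HeadModel`, ready for the MetaHuman Identity step.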
This conformed the MetaHuman model to the mesh generated within CC4, ready for additional tweaks, namely the addition of hair and matching the skin texture to the original source .bmp.
Towards the end of this process, CC4 updated to CC5, which meant some of these steps had to be repeated and amended to allow for HD textures and more accurate simulation and bone rigging.
Virtual Gallery
The virtual gallery was created within Unreal 5 using a combination of FAB store assets and custom models made in Blender. Images, videos and textures were sourced from the university website alongside custom-made assets, and custom Blueprints were created to enable additional features and controls. To drive the NPC MetaHuman character, ConvAI was installed and set up to provide the AI personality and respond to microphone and text inputs; its information is sourced from the university website and the OpenAI network.
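In the exhibition build the ConvAI Unreal plugin handles the voice and text queries inside Blueprints, but the request/response shape can be sketched outside the engine too. The snippet below is a rough illustration only: the endpoint, field names and the character/session IDs are assumptions based on ConvAI's hosted REST interface, and should be checked against the current documentation.

```python
import json
import urllib.parse
import urllib.request

# Assumed endpoint for ConvAI's hosted text-query API; verify against current docs.
CONVAI_URL = "https://api.convai.com/character/getResponse"


def build_query(char_id, session_id, user_text):
    """Assemble the form fields for one text query (field names are assumptions)."""
    return {
        "charID": char_id,          # the hosted character's ID
        "sessionID": session_id,    # "-1" starts a fresh conversation
        "userText": user_text,      # the visitor's typed or transcribed prompt
        "voiceResponse": "False",   # text-only reply for this sketch
    }


def ask_character(api_key, char_id, user_text, session_id="-1"):
    """Send one prompt to the hosted character and return its reply text."""
    data = urllib.parse.urlencode(build_query(char_id, session_id, user_text)).encode()
    req = urllib.request.Request(
        CONVAI_URL, data=data, headers={"CONVAI-API-KEY": api_key}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("text")
```

In-engine, the plugin performs the equivalent exchange and feeds the reply into the MetaHuman's lip-sync and animation, so no hand-written networking is needed there.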
Pictures of the experience will be added to this blog post after the event on 10th September 2025.