Digital twins will change the face of medicine and agritech

July 19, 2024

Like a scene out of Iron Man 2, Jensen Huang took to the stage at 2024’s GTC conference to reveal an impressive, if slightly intimidating, line of humanoid robots. But what personally blew me away was when, not a minute later, a little droid waddled onto the stage… almost literally straight out of the Star Wars universe.

It had character; it behaved like it was sentient. And most importantly, it was cute.

Following my fascination with it, I went down a rabbit hole looking at the current state of autonomous intelligence. And I found that, thanks to recent breakthroughs in robotics and machine intelligence, the ways we interact with and utilize autonomous agents are changing drastically - and for the better - offering solutions to some of the most compelling modern-day challenges in sectors from entertainment to scientific research alike.

But I couldn’t stop thinking - is there anything that could be learned from the introduction of autonomous biological agents?

The ways we interact with and utilize autonomous agents are changing drastically - and for the better […] in sectors from entertainment to scientific research alike.

Autonomous agents with advanced AI models

With examples like the Star Wars droid, or Boston Dynamics' Atlas 2, we already have impressively autonomous robots successfully entering the zeitgeist - and, in the case of the former, the real world… under the full scrutiny of the public. (Or the full scrutiny of Disney World-goers, anyhow.)

If you’ve been in the field a while, you’ll know machine learning-based movement is not new in robotics. Take, for instance, MIT’s mini cheetah - the fastest four-legged robot on Earth, which used a machine learning model to learn how to run at ~10 mph. And since the release of advanced AI models and breakthroughs like NVIDIA’s collaboration with Disney on the aforementioned adorable little Star Wars droid, several other promising applications of machine learning-based locomotion have been researched and developed - and they're drastically speeding up state-of-the-art simulations of real organisms.

Unlike their endearing droid-themed counterparts, however, most of these AI models are still confined to the lab. Google DeepMind's digital twin of a rat may not be waddling around a theme park for our entertainment, for instance. But it is speeding up our understanding of neural control and motor function, and it's exciting to see how this will be applied to neuroscience and robotics:

(Video from Diego Aldarondo, researcher at Fauna Robotics | Google DeepMind)

As briefly touched on in the video - and in much more depth in DeepMind’s full article in Nature - the digital twin creates a virtual representation of a biological system's motor control using advanced machine learning techniques, loosely related to the technology that powers large language models like ChatGPT. Except, in place of a large language model (LLM), this approach deploys a different type of artificial neural network (ANN): one that transforms desired 3D movement trajectories and sensory information into the motor commands required to execute that movement. This is known as an inverse dynamics model.
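To make the idea of an inverse dynamics model concrete, here's a minimal sketch in Python. Everything here is illustrative - the layer sizes, input choices, and randomly initialised weights are my own stand-ins, not DeepMind's actual architecture - but the shape of the computation is the point: (where the body should go, what the body currently senses) goes in, motor commands come out.

```python
import numpy as np

# Toy inverse dynamics model (all names and sizes are illustrative,
# not DeepMind's actual network): map a desired 3D trajectory plus
# current sensory state to the motor commands that would produce it.

rng = np.random.default_rng(0)

N_TRAJ = 15     # e.g. the next 5 timesteps of a 3D target trajectory, flattened
N_SENSE = 20    # e.g. joint angles and velocities fed back from the body
N_MOTOR = 8     # e.g. torques for 8 actuated joints
N_HIDDEN = 64

# Randomly initialised weights stand in for a trained network.
W1 = rng.normal(0, 0.1, (N_HIDDEN, N_TRAJ + N_SENSE))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0, 0.1, (N_MOTOR, N_HIDDEN))
b2 = np.zeros(N_MOTOR)

def inverse_dynamics(desired_traj, sensory_state):
    """(where the body should go, what the body feels) -> motor commands."""
    x = np.concatenate([desired_traj, sensory_state])
    h = np.tanh(W1 @ x + b1)      # hidden activity - the analogue of "neural" activity
    return np.tanh(W2 @ h + b2)   # bounded motor commands, e.g. normalised torques

motor_cmd = inverse_dynamics(rng.normal(size=N_TRAJ), rng.normal(size=N_SENSE))
print(motor_cmd.shape)  # (8,)
```

The hidden activations `h` are what make this scientifically interesting: in DeepMind's work, it was the trained network's internal activity - not the movements themselves - that best matched recordings from real rat brains.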

…If that went over your head, don’t worry. In layman’s terms, the digital twin of the rat uses an artificial neural network to simulate the biological neural network of the rat’s brain. And the fact that it learned to do this from data alone is fascinating.

In layman’s terms, the digital twin of the rat uses an artificial neural network to simulate the biological neural network of the rat’s brain.

In essence, whilst ChatGPT can speak English, these AI models learned to "speak" the language of motor control: predicting neural activity in the striatum and motor cortex, mimicking how our brains translate high-level movement plans (like reaching for a coffee cup) into the specific motor commands needed to execute those movements.

Indeed, when tested against real rats performing the same behaviors as the virtual model, the ANNs controlling the virtual rat showed activity that closely corresponded to the neural activity in the brains of the real rats. This approach could therefore be applied to study aspects of neuromotor control that are challenging to probe experimentally.
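The kind of comparison described above - "is recorded neural activity better predicted by the virtual rodent's network activity, or by features of the movement itself?" - can be sketched as a simple regression contest. The data below is synthetic and constructed so that network activity wins, purely to show the shape of the analysis, not to reproduce DeepMind's actual methods or numbers.

```python
import numpy as np

rng = np.random.default_rng(1)

T = 500                            # timesteps of recording
net = rng.normal(size=(T, 32))     # virtual rodent's network activity (synthetic)
move = rng.normal(size=(T, 6))     # kinematic features of the movement (synthetic)

# Pretend the recorded neural signal is driven mostly by network-like activity.
neural = net @ rng.normal(size=32) + 0.3 * rng.normal(size=T)

def r2_linear(X, y):
    """Variance in y explained by an ordinary least-squares fit from X."""
    X1 = np.column_stack([X, np.ones(len(X))])   # add an intercept column
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ coef
    return 1 - resid.var() / y.var()

print(f"R^2 from network activity:  {r2_linear(net, neural):.2f}")
print(f"R^2 from movement features: {r2_linear(move, neural):.2f}")
```

On this toy data, network activity explains nearly all the variance while movement features explain almost none - the same qualitative result the paper reports for motor cortex, which is what suggests the network captures something about *how* the brain controls movement, not just *what* the body does.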

From the abstract of their paper:

We built a 'virtual rodent', in which an artificial neural network actuates a biomechanically realistic model of the rat in a physics simulator […] we found that neural activity in the motor cortex was better predicted by the virtual rodent's network activity than by any features of the real rat's movements.

[…] These results demonstrate how physical simulation of virtual animals can help interpret the structure of neural activity across behavior and relate it to theoretical principles of motor control.

This method, if further developed and adapted, could open up new possibilities for studying complex aspects of neurological disorders and drug interventions. By providing a framework to simulate neural control of movement, similar approaches could theoretically be adapted to model how different interventions affect motor symptoms in neurological disorders. For instance, future models might help researchers investigate the impact of various drugs on motor function in diseases like Parkinson's or ALS.

Will digital twins revolutionize healthcare and agriculture?

As many of us know, AI in biological research means living systems can be modelled and simulated at unprecedented speed. We can now independently model core biological processes, such as metabolism and cell signalling - laying the groundwork for limited digital twins of real-life organisms in healthcare and agricultural applications.

To elaborate on what I mean: a lot of time in hospitals and clinics is spent performing diagnostic tests and monitoring patient conditions. Perhaps, decades from now, AI-powered digital twins could simulate patient responses to treatments or predict disease progression, allowing healthcare professionals to make more informed decisions, faster.

Similarly, in agriculture, digital twins of crops could revolutionize farming practices. By creating virtual models of plants that respond to different environmental conditions, we could optimize crop yields, reduce water usage, and better prepare for climate change impacts.
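A crop digital twin doesn't have to be exotic to be useful. Here's a deliberately tiny toy model - every parameter is made up for illustration, and real agronomic models are far richer - showing the core loop: simulate the virtual crop under different irrigation schedules and weather, then compare yields before committing real water to real fields.

```python
import numpy as np

# Toy "digital twin" of a crop (all parameters are invented for illustration):
# simulate biomass accumulation over a season under different irrigation
# schedules, given an erratic rainfall scenario.

def simulate_yield(irrigation_mm_per_day, rainfall, days=120):
    """Very simplified water-limited growth model."""
    biomass, soil_water = 0.0, 50.0
    for day in range(days):
        soil_water += rainfall[day] + irrigation_mm_per_day
        uptake = min(soil_water, 6.0)                      # crop uses at most 6 mm/day
        soil_water = max(soil_water - uptake - 2.0, 0.0)   # 2 mm/day lost to evaporation
        biomass += 0.5 * uptake                            # growth proportional to water used
    return biomass

rng = np.random.default_rng(2)
rainfall = rng.gamma(shape=0.5, scale=4.0, size=120)       # erratic rain, ~2 mm/day on average

for irr in (0.0, 2.0, 4.0, 6.0):
    y = simulate_yield(irr, rainfall)
    print(f"irrigation {irr} mm/day -> yield {y:.0f}, applied water {irr * 120:.0f} mm")
```

Even a model this crude makes the trade-off visible: past the point where the crop's daily uptake saturates, extra irrigation buys no extra yield - exactly the kind of question a richer digital twin could answer for a specific field, cultivar, and climate forecast.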

As the costs of developing and implementing these digital twin technologies drop, their applications will no longer be limited to mega-corporations like Disney, and will begin to trickle down to agribusiness and hospitals. Active, modernizing communities such as small-scale farmers' cooperatives or regional healthcare systems could play a significant role in adopting digital twin solutions, potentially democratizing access to cutting-edge agricultural and medical technologies.
