Fewer than 500 automation-friendly instruments exist for discovery and production use, and getting them to work in tandem to run a specific experiment is clunky. Some protocols simply can’t be automated end to end, so scientists have to manually move samples between automated workcells and manual point solutions, disrupting their workflow.
Lila Sciences is a company building a scientific superintelligence platform. One of their workflows relies on the LabChip GX Touch, a protein characterization instrument that cannot be automated with software alone: operating it requires very fine motor control and non-standard consumables.
Lila Sciences partnered with Medra to fully automate their workflow involving the LabChip GX Touch.
Within three months, Medra’s Physical AI platform was carrying out the manual steps needed to run the machine autonomously. Its Instrument Agent uses vision-language-operation models to operate the LabChip GX Touch’s software without human intervention.
When something didn’t go as planned, such as a lid failing to open or a tube sitting slightly off-center, Medra’s computer vision models recognized the issue and the system recovered without stopping the run.
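As a rough illustration of what this kind of closed-loop recovery involves, the sketch below shows a generic vision-gated retry loop: act, check the image, and only move on once the step is visually confirmed. It is not Medra’s implementation, and every name in it (robot, camera, detector, RECOVERY_ACTIONS, MAX_RETRIES) is a hypothetical stand-in for illustration only.

```python
# Hypothetical sketch of a vision-gated retry loop; none of these names are
# Medra's actual API. The idea: act, look, and only move on once the image
# confirms the step succeeded, applying a known fix when it did not.

MAX_RETRIES = 3  # assumed retry budget per protocol step

# Assumed mapping from a detected issue to a corrective action.
RECOVERY_ACTIONS = {
    "lid_not_open": "retry_open_lid",
    "tube_off_center": "recenter_tube",
}

def run_step(step, robot, camera, detector):
    """Run one protocol step, verifying it visually and recovering in place."""
    for _ in range(MAX_RETRIES):
        robot.execute(step)                      # attempt the physical action
        frame = camera.capture_frame()           # image the workspace
        issue = detector.detect_anomaly(frame)   # e.g. "lid_not_open", or None
        if issue is None:
            return                               # step verified; continue the run
        recovery = RECOVERY_ACTIONS.get(issue)
        if recovery is None:
            break                                # unknown issue: stop guessing, escalate
        robot.execute(recovery)                  # corrective action, then re-check
    raise RuntimeError(f"step {step!r} not verified after {MAX_RETRIES} attempts")
```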
This flexibility meant:
Scientists could modify and execute protocols independently through Medra’s UI, without any hands-on automation engineering.
Medra’s dedicated team of Forward Deployed Engineers was on call to troubleshoot and respond to new requirements, freeing Lila’s engineers to focus on other high-priority projects.
“I have over 10 years in lab automation across dozens of deployments with some of the biggest biotechs. Automating the LabChip GX was not considered viable, until we met the Medra AI team. This is by far the smoothest and fastest deployment I have experienced. Medra’s team is exceptional.” – Catherine Heywood