How Aurora uses Highlight for Troubleshooting and Analyzing Labeling Performance
Highlights:
• Aurora stores more than 15,000 hours of sessions every week
• Implementation took only a few hours, and they were set up end to end within a week
• Aurora relied on Highlight’s support for HTML Canvas and WebGL recording
• Aurora estimates $100,000+ worth of savings every year thanks to Highlight
The Problem:
Aurora’s AI model requires millions of human annotations a week. Aurora wanted QA-level monitoring of these annotations, but standard logging and data-monitoring tools weren’t working for them.
How Highlight Helped:
Highlight’s session replay tool allowed Aurora to automatically sync data from labelers to the session, so that if there was any issue with data labeling, they could see the actual user journey at the moment the issue happened. This saved time and made the process far more efficient than their previous setup. With added logging and analytics, Aurora also used Highlight to debug crashes and act on labeler feedback to improve their process.
Aurora is a self-driving innovator whose core technology is the Aurora Driver. Their mission is to continuously improve intelligent autonomous driving systems designed to see, understand, and safely traverse the world around them. To do this, Aurora employs hundreds of people who supervise the AI learning component: their AI learns from human annotations, which requires millions of annotations per week.
After trying various logging, web-tracking, and even custom tools, Aurora simply couldn’t keep up with QA monitoring of these annotations; they had hit a ceiling on what they could achieve. Monitoring a workforce at this scale requires interconnected logging, recording, and analytics, and none of the off-the-shelf tools gave them enough insight to take things to the next level. That’s when they turned to Highlight.
The two main reasons Highlight stood out were the simplicity of implementation and the power of the session replay tool.
“[Integration] was smooth. We were able to get Highlight working within hours using their SDK. Once we signed the enterprise solution, it took less than a week to get the system running end to end.”
A particularly strong element of the session replay tool was Highlight’s support for HTML Canvas and WebGL recording. Aurora uses WebGL to render complex 3D data from its vehicles in the web browser, allowing labelers to manipulate and annotate a view of the real world.
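For teams looking to do something similar, a minimal setup might look like the sketch below. It uses Highlight’s browser SDK (highlight.run); the project ID is a placeholder, and the canvas-recording option values are illustrative rather than Aurora’s actual configuration.

```typescript
// A minimal setup sketch, assuming Highlight's public browser SDK
// (highlight.run). The project ID is a placeholder and the sampling
// values are illustrative, not Aurora's real configuration.
import { H } from 'highlight.run';

H.init('YOUR_PROJECT_ID', {
  // Opt in to recording <canvas> elements, including WebGL contexts,
  // so the 3D annotation view is captured in the replay.
  enableCanvasRecording: true,
  samplingStrategy: {
    canvas: 2, // snapshot canvas contents roughly twice per second
  },
});
```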
Once implemented, Highlight allowed Aurora to audit every annotation by linking it to the corresponding Highlight session data. Aurora also leveraged Highlight to understand labeler feedback and debug crashes.
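As a rough illustration of how that linking could work, the sketch below identifies the labeler and records each annotation as a tracked event. H.identify and H.track come from the highlight.run SDK; the labeler fields, event name, and IDs are hypothetical.

```typescript
// A hypothetical sketch of tying annotation events to a replay session.
// H.identify and H.track are part of highlight.run; the field names and
// event properties here are invented for illustration.
import { H } from 'highlight.run';

export function onLabelerLogin(email: string, teamId: string): void {
  // Associate all subsequent replay data with this labeler.
  H.identify(email, { teamId });
}

export function onAnnotationSubmitted(annotationId: string, taskId: string): void {
  // Attach the annotation ID to the session so QA reviewers can jump
  // from a flagged annotation straight to the replay of its creation.
  H.track('annotation-submitted', { annotationId, taskId });
}
```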
Today, Aurora stores more than 15,000 hours of sessions every week. They estimate that QA monitoring has improved by more than 10% across the entire workforce, which at their scale works out to roughly $100,000 in savings per year.
“We like the mix of basic and advanced features. You can get a lot by using the basic APIs and web interface, but you can build advanced logging if you trace your code further and access the data. We plan to integrate further with our process and build an integration with our data backend to build richer analytics.”
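A speculative sketch of the kind of backend integration the quote describes: fetch the replay URL for the current session and store it alongside the annotation record, so richer analytics can join on it later. H.getSessionURL is part of the highlight.run SDK; the endpoint is hypothetical.

```typescript
// A speculative sketch, not Aurora's actual pipeline. H.getSessionURL
// comes from highlight.run; the /api/annotations endpoint is invented.
import { H } from 'highlight.run';

export async function saveAnnotationWithReplay(annotationId: string): Promise<void> {
  // Resolve a shareable URL for the current replay session.
  const sessionUrl = await H.getSessionURL();

  // Persist it next to the annotation record in a hypothetical backend.
  await fetch(`/api/annotations/${annotationId}/replay`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ sessionUrl }),
  });
}
```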
Interested in getting set up in the same way Aurora did? Learn more about Highlight’s product offerings here.