Welcome back!
This week, we do a deep dive into computer vision + surgery, and highlight one paper of the week in surgical artificial intelligence.
Table of Contents
👀 Can computer vision usher in an era of virtual assistants in the OR?
🏆 Paper of the week - Spine Surgery Predictive Models: Crystal Ball 🔮 ?
🎺 Best of Twitter
Computer Vision: less autopilot, more co-pilot
How can cameras and high throughput computing help in and out of the operating room? Let’s start off with two simple truths:
1. Operative microscopes, endoscopes and laparoscopes can record video with the push of a button.
2. Computers can be taught to "see" video and make predictions; that is how self-driving cars work (some of them, like Tesla’s, rely only on video)
Despite sensational claims that seem too good to be true ("Would YOU let a robot operate on you alone?"), the immediate impact of computer vision on the operating room will begin with the automation of pre- and post-surgical workflows.
Surgeons, how many hours have you spent editing operative video, paring down 3-4 hours of surgical footage into 2 minutes? This is surgical content creation! Imagine an AI tool that took your video(s) and pieced together the highlights, auto-generated captions, or even generated a script based on the operative report. Look at this TikTok** showing how AI could generate short-form content from long-form video.
AI won’t give you a beautiful final product, but it can automate the first 50% of the work and do a pretty good job with the next 30%.
Will giving surgeons a “Highlights” tool fundamentally alter how surgical care is delivered? Probably not. Might it help buy back a few hours every month that would be spent frustratingly editing video the night before a conference talk? Definitely. Seems like a worthy endeavor to me.
Let’s look at an example inside the OR.
In manufacturing, computer vision ensures bolts go on in sequence and according to standard. These applications serve as a real-time event log, provide guidance and identify deficient or skipped steps.
In surgery, automation can assist team members who may need to “read the surgeon’s mind” to know which supplies to open. Opening excess instrumentation or disposables is (at its most benign) a massive waste of resources, while forgetting to open critical items can cause delays (at $30-$40/minute of OR time, typically) or influence outcomes. Checklists have been popularized for good reason - what if we used vision-based systems to automate the intraoperative checklist?
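The idea above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's system: it assumes a video model already emits step labels for a case (that detection step is stubbed out here), and simply audits the event log against an expected checklist. The step names are made up for the example.

```python
# Hypothetical sketch: audit a vision-derived event log against a procedure
# checklist. A real system would get `detected_steps` from a video model;
# here we just pass in a list of labels.

EXPECTED_STEPS = ["timeout", "incision", "implant_opened", "closure", "count_correct"]

def audit_checklist(detected_steps):
    """Return the checklist items never observed in the video event log."""
    seen = set(detected_steps)
    return [step for step in EXPECTED_STEPS if step not in seen]

# Example: the vision system logged these events during the case
detected = ["timeout", "incision", "closure", "count_correct"]
print(audit_checklist(detected))  # -> ['implant_opened']
```

The output doubles as both a real-time prompt ("implant not yet opened") and a retrospective event log for quality review.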
**that’s 2/3 articles with a Tik Tok reference, for anyone keeping score.
🏆 Best Paper Award: Spine Surgery Predictive Models: Do Our Methods Make Sense and Does It Even Matter?
Kudos to this multi-institutional team for doing what seems very rare these days: reproducing the methods of prior work (1) and then building upon them.
Developing datasets for machine learning applications across a long timespan means that models will train on patients who received significantly different care than patients would today. For oncologic care, that timeline could be as short as a few years, given the constant innovation in antineoplastic agents.
This can yield a model that predicts outcomes very well for patients treated 10 years ago, but not for patients treated today. As a result, the authors propose splitting the data by date and validating on a recent cohort, so that performance is evaluated on patients representative of current practice.
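A temporal split like the one the authors propose is simple to implement: instead of shuffling patients randomly into train and validation sets, you pick a cutoff date and hold out everyone treated after it. The field names and dates below are illustrative, not from the paper.

```python
# Sketch of a date-based (temporal) train/validation split, as opposed to a
# random split. All records and field names are made up for illustration.
from datetime import date

patients = [
    {"id": 1, "surgery_date": date(2012, 5, 1), "survived_1yr": False},
    {"id": 2, "surgery_date": date(2015, 8, 9), "survived_1yr": True},
    {"id": 3, "surgery_date": date(2020, 3, 2), "survived_1yr": True},
    {"id": 4, "surgery_date": date(2021, 11, 30), "survived_1yr": True},
]

cutoff = date(2019, 1, 1)
train = [p for p in patients if p["surgery_date"] < cutoff]
recent_validation = [p for p in patients if p["surgery_date"] >= cutoff]

# Train on `train`, then report metrics on `recent_validation`, so the
# headline number reflects patients treated under current practice.
```

The trade-off is that the held-out recent cohort is usually smaller, so the performance estimate is noisier; the benefit is that it is an honest estimate of how the model will do on the next patient through the door.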
In this case, they predicted one-year survival of patients with metastases to the spine. In this study, splitting by time didn’t seem to make much of a difference. Nonetheless, it is a very interesting point that likely holds true in fields where techniques or medications are rapidly evolving.
It did spark an interesting question though:
Send us your thoughts - how humans and AI work together is its own entire field of study, and one we have only scratched the surface of.
(1) Azad et al.
Best of Twitter 🟦☑️
Speaking of surgeon-AI teaming… people perform best when their boss gives them AI-generated feedback without telling them it’s AI-generated.
Meta has used language models to predict protein structures
Feeling inspired? Drop us a line and let us know what you liked.
Like all surgeons, we are always looking to get better. Send us your M&M-style roastings or favorable Press Ganey ratings by email at ctrl.alt.operate@gmail.com