AI ball tracking solved

AI ball tracking is now working really well. I think it is on par with competitors (Veo, Trace, Pixellot, etc.). Here is a video cropped to 4K:

Here is another one, cropped to 2880x1440. The AI is still able to keep the ball in frame most of the time despite the smaller crop area. When the ball is not visible, it is usually during unimportant actions (throw-ins, dead balls, etc.).
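For the curious, the "virtual camera" idea behind a cropped output boils down to smoothing the detected ball position and clamping a fixed-size crop window to the stitched frame. This is only a conceptual sketch, not the actual algorithm; the function name, the smoothing factor, and the exponential-smoothing choice are all illustrative assumptions:

```python
def crop_window(ball_xy, frame_wh, crop_wh, prev_center=None, alpha=0.2):
    """Follow a detected ball with a fixed-size crop window.

    ball_xy     : (x, y) detected ball position in the stitched frame
    frame_wh    : (width, height) of the full stitched frame
    crop_wh     : (width, height) of the output crop, e.g. (2880, 1440)
    prev_center : smoothed camera center from the previous frame
    alpha       : smoothing factor; lower = lazier, more human-like panning

    Returns the crop's top-left corner, clamped inside the frame,
    plus the new smoothed center to feed into the next call.
    """
    bx, by = ball_xy
    if prev_center is None:
        cx, cy = float(bx), float(by)
    else:
        # Exponential smoothing so the virtual camera doesn't jitter
        # with every small ball movement.
        cx = prev_center[0] + alpha * (bx - prev_center[0])
        cy = prev_center[1] + alpha * (by - prev_center[1])

    fw, fh = frame_wh
    cw, ch = crop_wh
    # Clamp so the crop never leaves the stitched frame.
    x = min(max(cx - cw / 2, 0), fw - cw)
    y = min(max(cy - ch / 2, 0), fh - ch)
    return (x, y), (cx, cy)
```

Feeding each frame's detection through this and cropping at the returned corner gives the "camera operator" effect; a real implementation would also have to handle missed detections and limit pan speed.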

If you'd like your video AI-tracked, please post links below.


This is awesome! Will this be a feature brought to the main software or will it always need to be on a submission basis? Thanks!

This looks really good. Are there any plans to incorporate perspective view in tracking/software? That would be the final piece in the jigsaw for me. Thank you.


I’d be very interested to hear about the release of this feature. This also would be a very valuable feature for me. I’d much prefer to give you money than give it to Veo.

@gchen, previously, you said your algorithm was tuned to the GoPro Hero 10. Is that still the case or can it handle video from other cameras?

I’ll probably make it a web service instead of including it in the current software.

Not at this time. With a small crop window like 2880x1440, the final video no longer looks so 'curved', so I think there is less need for a perspective view.

It is a completely new algorithm now and it should be able to handle video from other cameras much better.

Neat! Would this be applicable to other sports? I record ultimate (frisbee) games and it’s basically 7v7 on a soccer-sized pitch throwing a disc. Would your algo be able to track that?


I’m not sure if it would work. If you have recorded a frisbee game with ActionStitch, can you share it with me so that I can give it a try?

FTR I am also interested in ultimate frisbee support. I just ordered a GoPro 12 with the Max Lens Mod 2.0. I will first see how well it performs before getting a 2nd one to use with ActionStitch. My primary use case is tactical analysis, where a fixed view is actually preferred.

However, it would also be sweet to get dynamic footage generated out of this, so this is looking amazing. I assume this solution would also work in the above-mentioned single-camera setup, right?

If it could also pick up GoPro HiLight Tags to generate an automatic highlight video, I would be over the moon.
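In case it helps, reading HiLight Tags doesn't look too scary. Here is a rough, untested sketch: it assumes the 'HMMT' user-data box layout that community GoPro tools report (fourcc, 4-byte big-endian count, then that many 4-byte millisecond offsets), and the `clip_commands` helper, file names, and clip lengths are all made up for illustration:

```python
import struct

def parse_hilight_tags(mp4_bytes: bytes) -> list:
    """Scan raw MP4 bytes for GoPro's 'HMMT' user-data box and
    return HiLight timestamps in milliseconds.

    Assumed layout (reported by community tools, not official docs):
    b'HMMT', then a 4-byte big-endian tag count, then that many
    4-byte big-endian millisecond offsets.
    """
    idx = mp4_bytes.find(b"HMMT")
    if idx < 0:
        return []
    count = struct.unpack_from(">I", mp4_bytes, idx + 4)[0]
    return [
        struct.unpack_from(">I", mp4_bytes, idx + 8 + 4 * i)[0]
        for i in range(count)
    ]

def clip_commands(video: str, tags_ms: list, pre=5, post=5) -> list:
    """Build ffmpeg commands that cut a short clip around each tag.
    Stream copy (-c copy) is fast but snaps to the nearest keyframe."""
    cmds = []
    for n, ms in enumerate(tags_ms):
        start = max(ms / 1000.0 - pre, 0)
        cmds.append(
            f"ffmpeg -ss {start:.2f} -i {video} -t {pre + post} "
            f"-c copy highlight_{n:02d}.mp4"
        )
    return cmds
```

Concatenating the resulting clips would give the automatic highlight reel; a proper implementation should walk the MP4 box tree rather than pattern-match the raw bytes.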

Just curious, why do you want to make ball tracking a web service instead of including it in the ActionStitch software? The videos for games can be large and take a while to upload.

The AI tracking algorithm requires a good GPU, which not everyone has. Plus, supporting all the different GPU architectures would be a headache…

Can you please share more info on what kind of GPU would be required for this, and what speed/framerate you can achieve?

I am very interested in this feature as I’m currently working on a hobby project of mine where I’m trying to build my own custom portable rig based on the NVIDIA Jetson platform and two 4K MIPI CSI-2 camera modules.

My final goal is to achieve direct streaming of soccer games, which would require capturing, stitching, tracking, transforming to a perspective view, and finally streaming.

I’ve currently done only PoCs and research on partial functionalities like capturing, stitching, and encoding, and the findings are very good (30 FPS with two 4K sources by utilizing the GPU's CUDA cores and the hardware H.265 encoders).
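For context, the capture → composite → encode → stream chain I have in mind looks roughly like this. This is an untested sketch, not my working PoC: the element names (nvarguscamerasrc, nvcompositor, nvv4l2h265enc, srtsink) are NVIDIA/GStreamer plugins shipped with Jetson L4T images, but the resolutions, bitrate, and SRT endpoint are placeholders, and nvcompositor only places the two feeds side by side — real stitching would need lens de-warp and blending first:

```shell
# Untested Jetson pipeline sketch — placeholder resolutions/endpoint.
gst-launch-1.0 \
  nvarguscamerasrc sensor-id=0 \
    ! 'video/x-raw(memory:NVMM),width=3840,height=2160,framerate=30/1' \
    ! comp.sink_0 \
  nvarguscamerasrc sensor-id=1 \
    ! 'video/x-raw(memory:NVMM),width=3840,height=2160,framerate=30/1' \
    ! comp.sink_1 \
  nvcompositor name=comp sink_0::xpos=0 sink_1::xpos=3840 \
    ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' \
    ! nvv4l2h265enc bitrate=20000000 \
    ! h265parse ! mpegtsmux \
    ! srtsink uri='srt://203.0.113.10:9000?mode=caller'
```

The heavy lifting (compositing, format conversion, H.265 encoding) stays in NVMM GPU memory end to end, which is how the hardware encoders sustain 30 FPS with two 4K sources.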

Please message me if you can share more info or if you need more info from me.