
These OpenCV AI camera modules put the power of computer vision in anyone’s hands for under $150

OpenCV launched twenty years ago with the goal of building a common, open-source infrastructure for computer vision. That anniversary is a fitting moment for the OpenCV Artificial Intelligence Kit (OAK) to release two 4K/30fps spatial AI camera modules, which do their processing on-device rather than in the cloud.

Each module has built-in chips for artificial intelligence processing, so it doesn’t lose precious time sending data off to a remote server for detection. That could make the crucial difference when you’re trying to detect an object or read a license plate before the car speeds off, or whatever else you’re doing with the camera.

Image: Luxonis

The company says the modules are “absurdly easy to use,” with the ability to get up and running in under 30 seconds. OAK-1 uses a single USB-C port for both data and power, while the more powerful OAK-D adds a 5V power cable, so confusing setups are a thing of the past. The OAK units ship with multiple neural nets for things like mask/no-mask detection, emotion recognition, facial landmarks, pedestrian detection, and vehicle detection. The team will be adding to this list as time goes on, or you can upload your own trained models to the devices.

OAK-1 can do lossless motion-based zoom when it detects moving objects, and OAK-D can do stereo depth, 3D object localization, and object tracking in 3D space. Nifty.
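For the curious, stereo depth boils down to triangulation: a feature appears shifted between the left and right camera images, and that shift (the disparity) maps to distance via depth = focal length × baseline / disparity. Here’s a minimal sketch of the math; the focal length and baseline values are illustrative placeholders, not the OAK-D’s actual calibration specs.

```python
# Hedged sketch of stereo-depth triangulation.
# The camera parameters are illustrative, not OAK-D's real calibration.

def depth_from_disparity(disparity_px, focal_length_px=800.0, baseline_m=0.075):
    """Return depth in meters for a feature shifted `disparity_px` pixels
    between the left and right images of a stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# A feature shifted 40 px between the two views:
print(round(depth_from_disparity(40.0), 2))  # 1.5 (meters)
```

Note how depth is inversely proportional to disparity: nearby objects shift a lot between the two views, distant ones barely at all, which is why stereo accuracy falls off with range.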

The company tells KnowTechie that it has already passed its crowdfunding goal, raising over $100K from more than 500 backers. Whew, AI tech is hot nowadays.

If you’re interested in grabbing either of the two OAK cameras, head on over to Kickstarter where you can still get the OAK-1 for $79 or the OAK-D for $149.

Have any thoughts on this? Let us know down below in the comments or carry the discussion over to our Twitter or Facebook.
