Xnor launches embedded AI platform AI2Go

Xnor.ai today launched AI2Go, a platform that gives developers and manufacturers pre-built AI models optimized for on-device artificial intelligence. AI2Go is designed for state-of-the-art edge computing in devices like cameras, drones, and sensors.

The platform comes with hundreds of models made especially for smart home, security, auto, entertainment, and surveillance devices. The service is designed to take on the challenges that arise when building AI for edge use cases, such as latency, power consumption, and limited available memory.

Models can be generated with a few clicks and lines of code, with constraint settings tuned to manage things like memory usage. Each model is customized for its use case and bundled with an inference engine.
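The constraint-driven selection described above can be illustrated with a minimal sketch. This is purely hypothetical: the catalog entries, field names, and `pick_model` helper below are invented for illustration and are not AI2Go's actual API.

```python
# Hypothetical illustration of constraint-based model selection:
# given a device's memory budget, choose the most accurate
# pre-built model that fits. Catalog entries are invented examples.
catalog = [
    {"name": "person-detector-small",  "memory_mb": 4,  "accuracy": 0.82},
    {"name": "person-detector-medium", "memory_mb": 16, "accuracy": 0.89},
    {"name": "person-detector-large",  "memory_mb": 64, "accuracy": 0.93},
]

def pick_model(budget_mb, models):
    """Return the highest-accuracy model whose footprint fits the budget."""
    candidates = [m for m in models if m["memory_mb"] <= budget_mb]
    if not candidates:
        return None
    return max(candidates, key=lambda m: m["accuracy"])

# A microcontroller-class device with 8 MB free fits only the small model.
print(pick_model(8, catalog)["name"])   # person-detector-small
# A camera SoC with 32 MB to spare can take the medium model.
print(pick_model(32, catalog)["name"])  # person-detector-medium
```

The same filter-then-rank pattern extends naturally to other constraints the article mentions, such as latency or power draw.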

“With version zero, people can specify these constraints, get a model, and download it. All of those models are already pre-trained; they just need to grab one and use it,” Xnor CEO Ali Farhadi told VentureBeat in a phone interview. “Version 1 will enable functionalities to let people bring their own training data for custom models, and with the second version, developers will be able to bring in already-trained models and optimize them for the edge.”


Embedded AI has grown in popularity as a way to deploy intelligence without a cloud or internet connection and to ensure user privacy. Smaller models can also allow developers and manufacturers to consider lower-cost or commodity hardware for their devices.

Earlier this year, Xnor demonstrated that it can create a computer vision model small enough to fit on an FPGA chip powered by a single solar cell.

Xnor will continue to offer enterprise services for manufacturers and customers. AI2Go models will come with free evaluation license agreements.

A number of hardware and software solutions for edge computing have been introduced in recent months, such as Nvidia’s Jetson Nano — its lowest-cost Jetson edge AI chip to date — in March. Qualcomm introduced its Cloud AI 100 chip for edge inference in April, and in March, Google launched TensorFlow Lite 1.0 for embedded devices.
