
Announcing TensorFlow Lite

Today, we’re happy to announce the developer preview of TensorFlow Lite, TensorFlow’s lightweight solution for mobile and embedded devices! TensorFlow has always run on many platforms, from racks of servers to tiny IoT devices, but as the adoption of machine learning models has grown exponentially over the last few years, so has the need to deploy them on mobile and embedded devices. TensorFlow Lite enables low-latency inference of on-device machine learning models. It is designed from scratch to be:
- Lightweight: enables inference of on-device machine learning models with a small binary size and fast initialization/startup
- Cross-platform: a runtime designed to run on many different platforms, starting with Android and iOS
- Fast: optimized for mobile devices, including dramatically improved model loading times and support for hardware acceleration

More and more mobile devices today incorporate purpose-built custom hardware to process ML workloads more efficiently. TensorFlow Lite supports the Android Neural Networks API to take advantage of these new accelerators as they become available. When accelerator hardware is unavailable, TensorFlow Lite falls back to optimized CPU execution.
