With the growing demand for AI-powered mobile applications, developers are increasingly looking for ways to integrate machine learning models into Flutter apps.
In this article, I'll walk you through integrating TensorFlow Lite models into your Flutter project. TensorFlow Lite (TFLite) and Flutter together enable efficient on-device machine learning for mobile applications: TFLite provides a lightweight, optimized runtime for executing models on mobile devices, while Flutter lets you ship a single codebase to both Android and iOS. We'll cover:

- How to choose and convert a model to TFLite
- Setting up Flutter and integrating TFLite
- Running inference on-device
- A sample use case: real-time image classification
Before you can use AI in your Flutter app, you need a model. You have two options:
- Use a pre-trained model: TensorFlow Hub offers models such as MobileNet (image classification), SSD MobileNet (object detection), and more.
- Train your own model: use TensorFlow or Keras to train a custom model, then convert it to the TFLite format.
To convert a .pb (SavedModel) or .h5 (Keras) model to TFLite:

```python
import tensorflow as tf

# Convert a SavedModel directory to the TFLite format
converter = tf.lite.TFLiteConverter.from_saved_model('saved_model_directory')
# For a Keras .h5 model, use instead:
#   model = tf.keras.models.load_model('model.h5')
#   converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
```
Add the tflite_flutter and tflite_flutter_helper packages to your pubspec.yaml file:

```yaml
dependencies:
  flutter:
    sdk: flutter
  tflite_flutter: ^0.10.0
  tflite_flutter_helper: ^0.4.0
```

Then, run:

```shell
flutter pub get
```
Place the .tflite model file inside the assets folder and declare it in your pubspec.yaml.
```yaml
flutter:
  assets:
    - assets/model.tflite
```
Load the model in your Dart code:

```dart
import 'package:tflite_flutter/tflite_flutter.dart';

late Interpreter interpreter;

Future<void> loadModel() async {
  interpreter = await Interpreter.fromAsset('model.tflite');
}
```
Use tflite_flutter_helper for data conversion and image preprocessing. Example for image classification:

```dart
import 'package:tflite_flutter/tflite_flutter.dart';
import 'package:tflite_flutter_helper/tflite_flutter_helper.dart';

// Resize, normalize, and convert the image to an input tensor
TensorImage inputImage = TensorImage.fromFile(imageFile);
ImageProcessor imageProcessor = ImageProcessorBuilder()
    .add(ResizeOp(224, 224, ResizeMethod.BILINEAR))
    .add(NormalizeOp(0, 255)) // (x - 0) / 255 maps pixel values to 0–1
    .build();
inputImage = imageProcessor.process(inputImage);

// Allocate an output buffer matching the model's output shape
// ([1, 1001] for MobileNet's 1001 classes)
var output = List.filled(1001, 0.0).reshape([1, 1001]);

interpreter.run(inputImage.buffer, output);
```
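For clarity, NormalizeOp(mean, std) applies (x - mean) / std to every pixel, which is why NormalizeOp(0, 255) rescales raw 8-bit values into the [0, 1] range. A tiny Python sketch of that arithmetic:

```python
def normalize(pixels, mean, std):
    """Apply the same (x - mean) / std mapping that NormalizeOp performs."""
    return [(p - mean) / std for p in pixels]

# With mean=0 and std=255, raw 8-bit pixel values land in [0, 1]
print(normalize([0, 128, 255], 0, 255))
```

A MobileNet trained on inputs in [-1, 1] would instead use NormalizeOp(127.5, 127.5), so always match the normalization to how the model was trained.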
Once you receive the output tensor, map it to human-readable labels and display results in your UI:
```dart
import 'dart:math';

List<String> labels = await FileUtil.loadLabels("assets/labels.txt");
List<double> scores = List<double>.from(output[0]);
int maxIndex = scores.indexWhere((s) => s == scores.reduce(max));
print("Prediction: ${labels[maxIndex]}");
```
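The label-mapping step is just an argmax over the score vector. The same logic, sketched in Python with made-up scores and labels:

```python
def top_prediction(scores, labels):
    """Return the label with the highest score, along with that score."""
    max_index = max(range(len(scores)), key=lambda i: scores[i])
    return labels[max_index], scores[max_index]

# Hypothetical output of a 3-class classifier
scores = [0.10, 0.75, 0.15]
labels = ["cat", "dog", "bird"]
print(top_prediction(scores, labels))  # → ('dog', 0.75)
```

In a real app you'd typically show the top few predictions with their confidence scores rather than a single label.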
This stack supports a range of real-world use cases:

- Image classification: identify objects in photos using MobileNet.
- Pose detection: analyze human poses for gaming or fitness apps.
- Speech recognition: transcribe speech using on-device voice models.
- Natural language processing: use BERT-based models for sentiment analysis or smart replies.
A few tips for production apps:

- Optimize models using quantization to reduce size and improve speed.
- Perform heavy processing in isolates or background threads to keep the UI responsive.
- Always handle edge cases such as missing permissions and incompatible devices.
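To make the quantization tip concrete: post-training quantization stores each weight as an 8-bit integer plus a scale and zero point. This is a simplified sketch of the affine mapping, not TFLite's actual implementation:

```python
def quantize(x, scale, zero_point):
    """Map a float to int8 via q = round(x / scale) + zero_point, clamped."""
    q = round(x / scale) + zero_point
    return max(-128, min(127, q))

def dequantize(q, scale, zero_point):
    """Recover an approximate float from the quantized value."""
    return (q - zero_point) * scale

# Example: weights roughly in [-1, 1], so scale = 1/127 and zero_point = 0
scale, zero_point = 1 / 127, 0
q = quantize(0.5, scale, zero_point)
print(q, dequantize(q, scale, zero_point))  # 0.5 survives with small error
```

Because each weight shrinks from 4 bytes to 1, this cuts model size roughly 4x at the cost of a small, usually acceptable accuracy loss.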
With TensorFlow Lite and Flutter, bringing AI capabilities into your mobile apps is no longer a distant dream. Whether you're building a health assistant, a smart camera, or a voice-driven chatbot, this stack gives you the tools to make it happen, right on the device: offline and fast.
Start experimenting today, and unlock the next generation of mobile experiences powered by AI.