Welcome to the official documentation for TensorFlow Lite Micro! This guide explains how to get started with TensorFlow Lite Micro, a version of TensorFlow designed to run machine learning models on microcontrollers.

Quick Start

  1. Install TensorFlow Lite Micro
  2. Create a simple model
  3. Optimize the model's performance

Glossary

  • TensorFlow Lite Micro: A lightweight version of TensorFlow designed for microcontrollers.
  • Machine Learning Model: A mathematical representation, learned from data, that maps inputs to predictions.

Resources

Example

Here's a simple example of a TensorFlow Lite Micro model in use:

#include "tensorflow/lite/c/common.h"
#include "tensorflow/lite/micro/micro_error_reporter.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

// ... (code to initialize and run the model)

// Output tensor
TfLiteTensor* output_tensor = interpreter->output(0);

// The dims array describes the tensor's shape; multiply its entries
// to get the total number of elements in the flattened buffer.
int element_count = 1;
for (int i = 0; i < output_tensor->dims->size; i++) {
  element_count *= output_tensor->dims->data[i];
}

// Process the model's output
for (int i = 0; i < element_count; i++) {
  float value = output_tensor->data.f[i];
  // Do something with the value
}

TensorFlow Lite Micro Architecture