Android quickstart for TensorFlow Lite

https://github.com/tensorflow/examples/tree/master/lite/examples/image_classification/android

 

Excerpted from https://tensorflow.google.cn/lite/guide/android

To get started with TensorFlow Lite on Android, we recommend exploring the following example.

Android image classification example

Read TensorFlow Lite Android image classification for an explanation of the source code.

This example app uses image classification to continuously classify whatever it sees from the device's rear-facing camera. The application can run either on a device or in an emulator.

Inference is performed using the TensorFlow Lite Java API and the TensorFlow Lite Android Support Library. The demo app classifies frames in real time, displaying the most probable classifications. It allows the user to choose between a floating point and a quantized model, select the thread count, and decide whether to run on the CPU, on the GPU, or via NNAPI.
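As a rough illustration of how those options come together, the Kotlin sketch below builds an Interpreter with a configurable thread count and an optional GPU or NNAPI delegate. The helper function, its parameters, and the model file name are placeholders rather than code from the demo app, and the GPU path assumes the separate tensorflow-lite-gpu dependency is on the classpath.

import android.content.Context
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.GpuDelegate
import org.tensorflow.lite.nnapi.NnApiDelegate
import org.tensorflow.lite.support.common.FileUtil

// Hypothetical helper: configures an Interpreter the way the demo's settings screen does.
fun createInterpreter(context: Context, useGpu: Boolean, useNnapi: Boolean, numThreads: Int): Interpreter {
    val options = Interpreter.Options().apply {
        setNumThreads(numThreads)                    // CPU thread count
        when {
            useGpu -> addDelegate(GpuDelegate())     // run supported ops on the GPU
            useNnapi -> addDelegate(NnApiDelegate()) // delegate execution to NNAPI
        }
    }
    // "mobilenet_v1.tflite" is a placeholder asset name; substitute your own model file.
    val modelBuffer = FileUtil.loadMappedFile(context, "mobilenet_v1.tflite")
    return Interpreter(modelBuffer, options)
}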

 

Build in Android Studio

To build the example in Android Studio, follow the instructions in README.md.

Create your own Android app

To get started quickly writing your own Android code, we recommend using our Android image classification example as a starting point.

The following sections contain some useful information for working with TensorFlow Lite on Android.

Use Android Studio ML Model Binding

To import a TensorFlow Lite (TFLite) model:

  1. Right-click on the module in which you would like to use the TFLite model, or click File, then New > Other > TensorFlow Lite Model.

  2. Select the location of your TFLite file. Note that the tooling configures the module's dependency on your behalf with ML Model Binding, and all required dependencies are automatically inserted into your Android module's build.gradle file.

    Optional: Select the second checkbox for importing TensorFlow GPU if you want to use GPU acceleration.

  3. Click Finish.

  4. The following screen will appear after the import is successful. To start using the model, select Kotlin or Java, then copy and paste the code shown under the Sample Code section. You can get back to this screen by double-clicking the TFLite model under the ml directory in Android Studio.
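As a hedged illustration only, the generated sample code typically looks something like the Kotlin below. The class name MobilenetV1, its package, and the output accessor all depend on the imported model file and its metadata, so treat every identifier here as a placeholder for what Android Studio actually generates for your model.

import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.support.image.TensorImage
import com.example.app.ml.MobilenetV1 // generated class; name and package follow your model file

fun runGeneratedModel(context: Context, bitmap: Bitmap) {
    // Instantiate the model wrapper that ML Model Binding generated under ml/.
    val model = MobilenetV1.newInstance(context)

    // Wrap the camera frame (or any Bitmap) and run inference.
    val image = TensorImage.fromBitmap(bitmap)
    val outputs = model.process(image)
    val categories = outputs.probabilityAsCategoryList // accessor name comes from the model's metadata

    // Release native resources when the model is no longer needed.
    model.close()
}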

Use the TensorFlow Lite Task Library

TensorFlow Lite Task Library contains a set of powerful and easy-to-use task-specific libraries for app developers to create ML experiences with TFLite. It provides optimized out-of-box model interfaces for popular machine learning tasks, such as image classification, question and answer, etc. The model interfaces are specifically designed for each task to achieve the best performance and usability. Task Library works cross-platform and is supported on Java, C++, and Swift (coming soon).

To use the Task Library in your Android app, we recommend using the AARs hosted at MavenCentral for the Task Vision library and the Task Text library.

You can specify this in your build.gradle dependencies as follows:

 
dependencies {
    implementation 'org.tensorflow:tensorflow-lite-task-vision:0.1.0'
    implementation 'org.tensorflow:tensorflow-lite-task-text:0.1.0'
}

To use nightly snapshots, make sure that you have added the Sonatype snapshot repository.

See the introduction in the TensorFlow Lite Task Library overview for more details.
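For orientation, here is a minimal Kotlin sketch of the Task Vision image classification API. It assumes a classification model named mobilenet_v1.tflite is bundled in the app's assets; the file name is a placeholder.

import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.task.vision.classifier.ImageClassifier

fun classifyWithTaskLibrary(context: Context, bitmap: Bitmap) {
    // Load the bundled model; "mobilenet_v1.tflite" is a placeholder asset path.
    val classifier = ImageClassifier.createFromFile(context, "mobilenet_v1.tflite")

    // Run classification on a Bitmap wrapped in a TensorImage.
    val results = classifier.classify(TensorImage.fromBitmap(bitmap))

    // Each Classifications entry carries a list of scored categories.
    val topCategory = results.firstOrNull()?.categories?.firstOrNull()
    println("Top result: ${topCategory?.label} (${topCategory?.score})")

    classifier.close()
}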

Use the TensorFlow Lite Android Support Library

The TensorFlow Lite Android Support Library makes it easier to integrate models into your application. It provides high-level APIs that help transform raw input data into the form required by the model, and interpret the model's output, reducing the amount of boilerplate code required.

It supports common data formats for inputs and outputs, including images and arrays. It also provides pre- and post-processing units that perform tasks such as image resizing and cropping.

To use the Support Library in your Android app, we recommend using the TensorFlow Lite Support Library AAR hosted at MavenCentral.

You can specify this in your build.gradle dependencies as follows:

 
dependencies {
    implementation 'org.tensorflow:tensorflow-lite-support:0.1.0'
}

To use nightly snapshots, make sure that you have added the Sonatype snapshot repository.

To get started, follow the instructions in the TensorFlow Lite Android Support Library.
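As an illustrative sketch of the Support Library's pre-processing units, the Kotlin below resizes and normalizes a Bitmap into a TensorImage ready to feed to an Interpreter. The 224x224 input size and the normalization mean/stddev of 127.5 are assumptions for a typical float MobileNet-style model, not values mandated by the library; match them to your model's input requirements.

import android.graphics.Bitmap
import org.tensorflow.lite.DataType
import org.tensorflow.lite.support.common.ops.NormalizeOp
import org.tensorflow.lite.support.image.ImageProcessor
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.support.image.ops.ResizeOp

fun preprocess(bitmap: Bitmap): TensorImage {
    // Chain pre-processing ops: resize to the model's input size, then normalize pixel values.
    val processor = ImageProcessor.Builder()
        .add(ResizeOp(224, 224, ResizeOp.ResizeMethod.BILINEAR))
        .add(NormalizeOp(127.5f, 127.5f)) // maps [0, 255] pixel values to roughly [-1, 1]
        .build()

    var image = TensorImage(DataType.FLOAT32)
    image.load(bitmap)
    image = processor.process(image)
    return image
}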

Use the TensorFlow Lite AAR from MavenCentral

To use TensorFlow Lite in your Android app, we recommend using the TensorFlow Lite AAR hosted at MavenCentral.

You can specify this in your build.gradle dependencies as follows:

 
dependencies {
    implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly-SNAPSHOT'
}

To use nightly snapshots, make sure that you have added the Sonatype snapshot repository.
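The snapshot repository is declared once at the project level. A minimal sketch of what that typically looks like in the project-level build.gradle is below; the URL is the standard Sonatype OSS snapshots endpoint, and you should verify it against the current TensorFlow Lite documentation.

allprojects {
    repositories {
        mavenCentral()
        // Sonatype snapshot repository, needed to resolve -SNAPSHOT artifacts
        maven {
            url 'https://oss.sonatype.org/content/repositories/snapshots'
        }
    }
}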

This AAR includes binaries for all of the Android ABIs. You can reduce the size of your application's binary by only including the ABIs you need to support.

We recommend that most developers omit the x86 and x86_64 ABIs. This can be achieved with the following Gradle configuration, which includes only armeabi-v7a and arm64-v8a and should cover most modern Android devices.

 
android {
    defaultConfig {
        ndk {
            abiFilters 'armeabi-v7a', 'arm64-v8a'
        }
    }
}

To learn more about abiFilters, see NdkOptions in the Android Gradle documentation.

 
