Machine Learning


Machine Learning in Tizen

There are two categories related to Machine Learning in Tizen: (a) preloaded intelligence services and (b) machine learning framework support. With (a), Tizen applications can call high-level APIs to invoke neural network models preloaded in Tizen; for example, "mv_face_detect()" in the Media Vision APIs. With (b), Tizen applications may execute their own neural network models (e.g., TensorFlow-Lite's .tflite files) with the MachineLearning.Inference.Single APIs, or create their own neural network pipelines with the MachineLearning.Inference.Pipeline APIs.

In this document, we describe the latter, (b).
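
As a quick illustration of (b), below is a minimal sketch of running a .tflite model with the C Single API (MachineLearning.Inference.Single). This is not official sample code: the model path is a hypothetical placeholder and the input/output handling is omitted, so consult the API reference for exact usage.

  #include <nnstreamer.h>
  #include <nnstreamer-single.h>

  int
  run_single_inference (void)
  {
    ml_single_h single;
    ml_tensors_info_h in_info;
    ml_tensors_data_h input, output;
    int status;

    /* Open a TensorFlow-Lite model; the path is a hypothetical example. */
    status = ml_single_open (&single, "/opt/usr/apps/my_app/res/model.tflite",
        NULL, NULL, ML_NNFW_TYPE_TENSORFLOW_LITE, ML_NNFW_HW_ANY);
    if (status != ML_ERROR_NONE)
      return status;

    /* Query the input tensor description and allocate an input buffer. */
    ml_single_get_input_info (single, &in_info);
    ml_tensors_data_create (in_info, &input);

    /* Fill the input buffer here (omitted), then run one inference. */
    status = ml_single_invoke (single, input, &output);

    /* Read the output tensors here (omitted), then release resources. */
    if (status == ML_ERROR_NONE)
      ml_tensors_data_destroy (output);
    ml_tensors_data_destroy (input);
    ml_tensors_info_destroy (in_info);
    ml_single_close (single);
    return status;
  }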


Neural Network Pipelines

In 2018-2019, there were various proposals for building neural-network systems on top of stream pipelines. Major companies including Samsung (NNStreamer), Google (MediaPipe), and NVidia (DeepStream) proposed very similar approaches, suggesting that most neural-network systems may be expressed as stream pipelines (i.e., the pipe-and-filter style).

Tizen supports such neural network pipelines natively through its public MachineLearning.Inference.Pipeline APIs, which provide simple interfaces for creating NNStreamer pipelines.

NNStreamer pipelines are GStreamer pipelines. Although the Tizen APIs restrict which GStreamer elements may be used (we manage a whitelist of elements allowed with the NNStreamer APIs), an NNStreamer pipeline is a general GStreamer pipeline. With NNStreamer, users may process tensor streams as if they were media streams and feed them to general neural network frameworks, much as MediaPipe does with media and tensor data streams. Note that while MediaPipe and NNStreamer handle tensors as first-class citizens of data streams, DeepStream does not; DeepStream treats tensors as metadata attached to conventional media streams, which makes it harder to write a serious machine-learning application or service, although it makes writing a demonstration or visualization application easier.
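
As a rough sketch (again, not official sample code), the C Pipeline API can be used along the following lines. The pipeline description and the sink name "sinkx" are illustrative, and whether each element may be used is governed by the whitelist described below.

  #include <nnstreamer.h>

  /* Invoked whenever the tensor_sink named "sinkx" receives a tensor buffer. */
  static void
  sink_cb (const ml_tensors_data_h data, const ml_tensors_info_h info,
      void *user_data)
  {
    /* Read the received tensors here (omitted). */
  }

  int
  run_pipeline_example (void)
  {
    ml_pipeline_h pipe;
    ml_pipeline_sink_h sink;
    int status;

    /* An illustrative pipeline: a test video stream is scaled, converted to a
     * tensor stream, and handed to the application through tensor_sink. */
    const char desc[] =
        "videotestsrc ! videoconvert ! videoscale ! "
        "video/x-raw,format=RGB,width=224,height=224 ! "
        "tensor_converter ! tensor_sink name=sinkx";

    status = ml_pipeline_construct (desc, NULL, NULL, &pipe);
    if (status != ML_ERROR_NONE)
      return status;

    ml_pipeline_sink_register (pipe, "sinkx", sink_cb, NULL, &sink);
    ml_pipeline_start (pipe);

    /* ... run the application's main loop here, then tear down ... */
    ml_pipeline_stop (pipe);
    ml_pipeline_sink_unregister (sink);
    ml_pipeline_destroy (pipe);
    return ML_ERROR_NONE;
  }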


Tizen's Machine Learning APIs

Tizen 5.5 (M2 / Released)

  • C API: Single, Pipeline
  • .NET API: Single
  • Web API: N/A

Tizen 6.0 (M1 / Plan)

  • C API: Single, Pipeline
  • .NET API: Single, Pipeline
  • Web API: Draft Only

How to write pipeline descriptions (WIP)

  • Refer to the GStreamer documentation for the pipeline description syntax; an illustrative example follows this list.
  • The elements that can be used in a Tizen ML pipeline are limited. The whitelist is available at: TBD (Whitelist draft)
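
For illustration only, a pipeline description is a gst-launch-style string such as the one below; the model path is a hypothetical placeholder, and every element used must appear on the whitelist.

  videotestsrc ! videoconvert ! videoscale !
      video/x-raw,format=RGB,width=224,height=224 !
      tensor_converter !
      tensor_filter framework=tensorflow-lite model=/path/to/model.tflite !
      tensor_sink name=sinkx

Here, tensor_converter turns the raw RGB frames into a tensor stream, tensor_filter runs the given TensorFlow-Lite model on each tensor, and tensor_sink delivers the results to the application (e.g., via a sink callback registered with the Pipeline API). Depending on the model, additional preprocessing elements such as tensor_transform may be required.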


Machine Learning Infrastructure in Tizen

Neural Network Framework Support @ 5.5 M2

  • Tensorflow-Lite 1.13: Supported via MachineLearning.Inference.*

Neural Network Framework Support @ Tizen:Unified (daily build)

  • Tensorflow-Lite 1.13: Supported via MachineLearning.Inference.* / We may upgrade it to 1.15.x for 6.0 M1.
  • Tensorflow 1.13: Available at devel:AIC:Tizen:5.0:nnsuite:test for x86_64 only.
  • Caffe: informally tested (/platform/upstream/caffe); not available at Tizen:Unified
  • CaffeOnACL: Available at Tizen:Unified
  • Caffe2/PyTorch: informally tested (/platform/upstream/pytorch); not available at Tizen:Unified
  • Google-Coral EdgeTPU Runtime: Available at Tizen:Unified. Accessible via MachineLearning.Inference APIs
  • OpenVINO w/ NCS: Available at Tizen:Unified. Accessible via MachineLearning.Inference APIs
  • Intel NCSDK2 w/ NCS: Available at Tizen:Unified. Accessible via MachineLearning.Inference APIs

Robotics Support