The goal of this article is to make it easier to create your own sign language translation app. However, this article isn’t a programming tutorial, so you will need to enjoy programming and be comfortable with the tools involved. Comfort with programming (C++, Python) won’t come overnight; expect a month or two of experimenting and learning. TensorFlow/Keras, on the other hand, can be picked up in a day or two with a few example datasets and some YouTube tutorials.
To begin, you must first have the following…
This article will describe much of what I’ve learned about the ready-made calculators in Mediapipe over the past year of using it. I’ve worked exclusively with the hand tracking example, so that is what I will be detailing. Documentation for these calculators exists within the source files, but I’ve found it insufficient. Here, I will break down the calculators and classes commonly used by hand tracking. This article is intended for readers with an intermediate understanding of how Mediapipe works.
The TfLiteInferenceCalculator is used to run .tflite model files. Set up the .pbtxt file as follows:
node {
  calculator: "TfLiteInferenceCalculator"
  input_stream: "TENSORS:input_tensors"
  …
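For context, a complete node definition also declares an output stream and points the calculator at the model to load. The sketch below is an illustrative example rather than the exact graph from the hand tracking demo; the options extension name is what the calculator's .proto defines as I recall it, and the model_path is a placeholder you would replace with your own .tflite file.

node {
  calculator: "TfLiteInferenceCalculator"
  input_stream: "TENSORS:input_tensors"
  output_stream: "TENSORS:output_tensors"
  options: {
    [mediapipe.TfLiteInferenceCalculatorOptions.ext] {
      # Placeholder path; point this at the .tflite model you want to run.
      model_path: "path/to/your_model.tflite"
    }
  }
}

The calculator consumes a vector of input tensors on the TENSORS stream, runs inference, and emits the resulting output tensors on its own TENSORS stream for downstream calculators to decode.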
Mediapipe is an open-source framework from Google for “building world-class machine learning solutions,” currently in alpha. It has been open source for about a year but has likely been under development for far longer. A key selling point of Mediapipe (besides being free) is that while the core is written in C++, it can easily be deployed to almost any platform, from WebAssembly to Android to macOS.
When it was first released, Mediapipe had only a few demos, but its GitHub page now boasts almost a dozen, from persistent object tracking and AR hair coloring to pose tracking that…