Massoud Mazar


Mobile Sensors: Easy data collection, labeling and model deployment

Disclaimer: This article targets data scientists who may not necessarily come from a software engineering background. Pardon me if you find it oversimplified.

Mobile devices provide a rich set of sensors that let us get a feel for where the device is being used and how. Sensors can tell us about the environment, motion and orientation of the device, among other things. A list of sensors supported by Android can be found here.

There are a lot of cool applications for the data coming from these sensors, and many of those applications could benefit from machine learning models that infer more meaning from the sensor data. A famous example is the classic Human Activity Detection using mobile phones, which can easily be found on the internet. But what if you need to collect and label data for a different purpose? Here I show an easy way to build a mobile app, for both data collection and testing of the trained model, which runs on both Android and iOS.

Ionic Framework

There are a few good frameworks that let you easily create an app deployable to both Android and iOS. I picked Ionic because of my previous knowledge of Angular and JavaScript (which are used to build apps in Ionic), but it also lets you use React and Vue. You will need some knowledge of one of these JavaScript-based frameworks to be able to use Ionic.

Preparations

Before looking at a code example, we need to get our system set up. I used a new Mac for this article, so I had to find most of the missing pieces myself. The Ionic documentation is very easy to follow and explains how to set up your development environment.

First Try

Now that your environment is set up, create a simple one-page app and see if you can get it to run on the computer and then on a device. The following command (run from a terminal) creates a directory (folder) named "sensors" in your current directory, based on the "blank" template:

ionic start sensors blank

After the app is created, change directory to the newly created directory and use ionic to serve your app in a browser:

cd sensors
ionic serve

This should open a browser showing your simple one-page app. You can hit CTRL+C to stop the Ionic server.

I suggest you run the same simple app on an emulator or a physical device (like a phone) to make sure your environment is fully set up before you dive into a lot of coding. The steps are explained in the Ionic documentation, but here is what I did to get the app deployed to my Android phone.

The first step is to build the app:

ionic build

After the build completes with no errors, you need to do a one-time setup of the target OS. For example, add Android support to your new app using this command:

ionic capacitor add android

I chose "capacitor" but you also have the choice of "cordova". I let you read up on what is the difference.

Now that your app has support for Android, every time you make a change and want to deploy it to Android, you will use the following commands:

ionic build
ionic capacitor copy android
ionic capacitor open android

The first command builds the code to include your recent changes, the second copies the build results to the target Android folders, and the third opens Android Studio for you. This workflow is how "capacitor" operates; I find it more convenient, which is why I chose it. See the Ionic documentation for Cordova instructions.

Now that our code is loaded in Android Studio, we can connect a device to the computer; Android Studio will detect it and let us deploy and run the app on the device.

Collecting Sensor Data

If you have followed the steps above, you should already have an app called "sensors" based on the "blank" template. I will be using this app as the basis for the next sections. All the modifications I refer to are made to files located under the "src/app/home" directory.

Simple UI

Since we are not building a fancy UI for end users, and the goal here is to collect sensor data for internal use, I use some simple HTML-based UI elements (mostly buttons) to control the behavior of the app. Open the "home.page.html" file in your IDE/editor and change the content to the following:

<ion-content>
  <br />
  <ion-button color="success" (click)="startCollect()" [disabled]="isRunning">Start</ion-button>
  <ion-button color="danger" (click)="stopCollect()" [disabled]="!isRunning">Stop</ion-button>
  <div style="height:100%;width:100%" (touchstart)="touchstart($event)" (touchend)="touchend($event)">
  </div>
</ion-content>

The above HTML-like code defines a content section, which contains two buttons and a "div" covering the whole screen. The buttons are used to start and stop the data collection and are color coded using the "color" attribute. The "click" event of each button calls the relevant function, and the "disabled" attribute of each button depends on the value of a variable called "isRunning".

An empty line break (br) is added before the buttons to make sure they are not stuck behind the notch most new phones have at the top of the screen, and the "div" is there to pick up any touch events on the screen. The idea is that when we want to label our data for a certain activity, we can tap anywhere on the screen. For simplicity, I'm only labeling one single activity, but if your scenario involves more label types, feel free to modify the UI to allow that (see the sketch after the touch handlers later in this article).

Since this app will potentially be running for long periods, it makes sense to save as much battery as possible. To help with that effort, we can change the background color of the app to near-black. Let's add the following to the "home.page.scss" file:

ion-content{
    --ion-background-color:#111D12;
}

Insomnia

One more small preparation before getting to the main functionality: I needed this app to prevent my phone screen from going to sleep, so I use an Ionic Native component called Insomnia. To add this component to your app, run the following command in the terminal:

npm install @ionic-native/insomnia
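
Note: as far as I know, "@ionic-native/insomnia" is just a TypeScript wrapper; if the underlying Cordova plugin is not pulled in automatically, you may also need to install it:

npm install cordova-plugin-insomnia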

Data Collection Logic

All the logic which drives the app is added to "home.page.ts", which is written in TypeScript. Think of TypeScript as modern JavaScript with optional static types.
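
If you have not seen TypeScript before, the main addition compared to JavaScript is the optional type annotations, which are checked at build time. A trivial illustration (not part of the app itself):

let sampleCount: number = 0;   // explicit type annotation
const appName = "sensors";     // type inferred as string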

In the top part of the code, we import the libraries we will be using and define plugins and providers:

import { Component } from '@angular/core';
import { DatePipe } from '@angular/common';
import { Insomnia } from '@ionic-native/insomnia/ngx';

import { Plugins,
  FilesystemDirectory,
  FilesystemEncoding
} from '@capacitor/core';

const { Motion } = Plugins;
const { Filesystem } = Plugins;

@Component({
  selector: 'app-home',
  templateUrl: 'home.page.html',
  styleUrls: ['home.page.scss'],
  providers:[DatePipe, Insomnia]
})
export class HomePage {
...

Let's define some class-level variables:

export class HomePage {
  runMode = "";
  isRunning = false;
  dirName = "sensordata";
  sensorFilename = "";
  touchFilename = "";
  sensorData = "";
  dataCounter = 0;
  accelEventListener:EventListenerOrEventListenerObject = null;
  orientEventListener:EventListenerOrEventListenerObject = null;
  currentOrientation:DeviceOrientationEvent;

The above variables are:

  • runMode: used later when we add ability to run the trained model in inference mode
  • isRunning: the variable which you saw used in UI to toggle the "disabled" attribute of buttons
  • dirName: directory name where we store the collected data on the device. It will be under "Documents"
  • sensorFilename: file name for sensor data
  • touchFilename: file name for touch (label) data
  • sensorData: intermediate string buffer to store data in memory before saving to file
  • dataCounter: counter to keep track of how much data is stored in memory
  • accelEventListener: pointer to event listener for accelerometer and some other sensors
  • orientEventListener: pointer to event listener for orientation sensor
  • currentOrientation: data for current orientation of device

I had to use a separate variable to store the current orientation because orientation data arrives at a lower frequency (5 Hz) than the other sensor data (60 Hz).

For the providers we will use in this module, we need to pass them as parameters to the constructor:

  constructor(public datepipe: DatePipe, private insomnia: Insomnia) {
    // make sure directory exists
    this.checkDirectory();
  }

In the above constructor, we call a function to make sure our target directory exists before we try to write data to it. It will create the directory under "Documents" if it does not exist. Here is the definition of the function:

  async checkDirectory() {
    try {
      let ret = await Filesystem.readdir({
        path: this.dirName,
        directory: FilesystemDirectory.Documents
      });
    } catch(e) {
      try {
        let ret = await Filesystem.mkdir({
          path: this.dirName,
          directory: FilesystemDirectory.Documents,
          createIntermediateDirectories: false // set to true for mkdir -p behavior
        });
      } catch(e) {
        console.error('Unable to make directory', e);
      }
    }
  }

We have a "Start" button on our UI which is going to call the "startCollect()" function. This is where we setup our event listeners and start collecting and saving the sensor data:

  startCollect() {
    this.insomnia.keepAwake();
    this.runMode = "collect";
    this.isRunning = true;

    this.touchFilename = this.getFileName() + ".touch.csv";
    this.createFile(this.dirName + "/" + this.touchFilename);

    this.sensorData = "";
    this.dataCounter = 0;
    this.sensorFilename = this.getFileName() + ".accel.csv";
    this.createFile(this.dirName + "/" + this.sensorFilename);
    this.accelEventListener = (event: DeviceMotionEvent) => {
      var accel = event.acceleration;
      var accelg = event.accelerationIncludingGravity;
      var rotrate = event.rotationRate;
      var orient = this.currentOrientation;

      if (orient != null && orient !== undefined) {
          this.sensorData += Date.now() + ","
          + accel.x + ","
          + accel.y + ","
          + accel.z + ","
          + accelg.x + ","
          + accelg.y + ","
          + accelg.z + ","
          + rotrate.alpha + ","
          + rotrate.beta + ","
          + rotrate.gamma + ","
          + orient.alpha + ","
          + orient.beta + ","
          + orient.gamma + "\n";
          this.dataCounter++;
          // flush the in-memory buffer to file every 1000 samples
          if(this.dataCounter >= 1000) {
            this.saveAccel();
          }
      }
    }
    window.addEventListener('devicemotion', this.accelEventListener);

    this.orientEventListener = (event: DeviceOrientationEvent) => {
      this.currentOrientation = event;
    }
    window.addEventListener('deviceorientation', this.orientEventListener);
  }
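
One platform caveat before you test this: on iOS 13 and later, WebKit requires an explicit user-granted permission before it delivers "devicemotion" and "deviceorientation" events. Below is a minimal sketch of such a check; the "requestPermission" API is standard WebKit, but the helper itself is my addition, and it must be triggered from a user gesture (for example, a button tap) before "startCollect()":

  async requestMotionPermission(): Promise<boolean> {
    const dme = DeviceMotionEvent as any;
    if (typeof dme.requestPermission === 'function') {
      // iOS 13+: the call must originate from a user gesture
      const status = await dme.requestPermission();
      return status === 'granted';
    }
    return true; // Android and older iOS deliver events without a prompt
  }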

Each row appended to the sensor file is therefore a CSV record with 13 values: a timestamp in milliseconds, followed by acceleration x/y/z, acceleration-including-gravity x/y/z, rotation rate alpha/beta/gamma, and orientation alpha/beta/gamma. And here are the utility functions:

  saveAccel(){
    // Take a copy of the buffer since saveData is async;
    // the (' ' + s).slice(1) idiom forces a real, independent string copy
    var data_copy = (' ' + this.sensorData).slice(1);
    this.saveData(this.dirName + "/" + this.sensorFilename, data_copy);
    this.sensorData = "";
    this.dataCounter = 0;
  }

  private getFileName() {
    const date = new Date();
    return this.datepipe.transform(date, 'yyyyMMdd-HHmmss');
  }

  async createFile(path: string) {
    try {
      // without await, a failed write would not be caught by this try/catch
      await Filesystem.writeFile({
        path: path,
        data: "",
        directory: FilesystemDirectory.Documents,
        encoding: FilesystemEncoding.UTF8
      });
    } catch(e) {
      console.error('Unable to create file', e);
    }
  }

  async saveData(path: string, data: string) {
    console.log(data.length);
    await Filesystem.appendFile({
      path: path,
      data: data,
      directory: FilesystemDirectory.Documents,
      encoding: FilesystemEncoding.UTF8
    });
  }

The Stop button in the UI calls the "stopCollect()" function:

  stopCollect() {
    window.removeEventListener('devicemotion', this.accelEventListener);
    window.removeEventListener('deviceorientation', this.orientEventListener);
    this.saveAccel();
    this.sensorFilename = "";
    this.touchFilename = "";
    this.currentOrientation = null;
    this.insomnia.allowSleepAgain();
    this.runMode = "";
    this.isRunning = false;
  }

And if you revisit the UI code, you will notice the "div" section calls "touchstart()" and "touchend()" for the purpose of data labeling:

  touchstart(event){
    if(this.runMode === "collect") {
      this.saveData(this.dirName + "/" + this.touchFilename, Date.now() + ",start\n");
    }
  }

  touchend(event){
    if(this.runMode === "collect") {
      this.saveData(this.dirName + "/" + this.touchFilename, Date.now() + ",end\n");
    }
  }
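
As promised earlier, if you need more than one label type, a small variant of these handlers can record a label name as an extra CSV column. This is only a sketch; the "labelStart()"/"labelEnd()" functions are my own additions, wired from the template with something like (touchstart)="labelStart('walking')":

  labelStart(label: string) {
    if (this.runMode === "collect") {
      this.saveData(this.dirName + "/" + this.touchFilename,
                    Date.now() + ",start," + label + "\n");
    }
  }

  labelEnd(label: string) {
    if (this.runMode === "collect") {
      this.saveData(this.dirName + "/" + this.touchFilename,
                    Date.now() + ",end," + label + "\n");
    }
  }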

That is all that's needed to collect sensor data and store it in a directory on the device!

You can access these files and download them to your computer by connecting your device with a cable. For large-scale data collection, it is better to build some more advanced functionality to let the app upload the data to cloud storage.
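
As a starting point, here is a minimal sketch of such an upload: it reads a saved file back with the same Filesystem API and POSTs it with "fetch". The endpoint URL is hypothetical, and a real implementation would need authentication, retries and cleanup of uploaded files:

  async uploadFile(fileName: string) {
    // read the saved CSV back from the Documents directory
    const contents = await Filesystem.readFile({
      path: this.dirName + "/" + fileName,
      directory: FilesystemDirectory.Documents,
      encoding: FilesystemEncoding.UTF8
    });
    // POST it to a (hypothetical) collection endpoint
    await fetch('https://example.com/upload?name=' + encodeURIComponent(fileName), {
      method: 'POST',
      headers: { 'Content-Type': 'text/csv' },
      body: contents.data
    });
  }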

Running the Model

After data is collected and a model is trained, it would be nice to run the model on a real device and see the performance of your model in action. In my case I used TensorFlow to build a deep learning model, so the natural choice was TensorFlow.js, which lets me run my model in the same (or a different) application with minimal modification. To install TensorFlow.js, simply add it to your application:

npm install @tensorflow/tfjs@1.2.6

Note: I'm specifically using version 1.2.6, as I noticed versions 1.2.9 and above cause a failure when I try to build my app.

The first step after building and training your model is to save it in a format which can be loaded by TensorFlow.js. TFJS comes with a tool to convert your model to the appropriate format; feel free to read about how to use it here. This conversion generated 2 files for me (you may get more group/shard files):

model.json
group1-shard1of1.bin

I copied these files to the "src/assets" directory of my app so they are included with the app. Now I was able to add a new button to the UI to run the model instead of collecting data. The new UI code looks like this:

<ion-content>
  <br />
  <ion-button color="success" (click)="startCollect()" [disabled]="isRunning">Start</ion-button>
  <ion-button color="danger" (click)="stopCollect()" [disabled]="!isRunning">Stop</ion-button>
  <ion-button (click)="startModel()" [disabled]="isRunning">Start testing the Model</ion-button>
  <div style="height:100%;width:100%" (touchstart)="touchstart($event)" (touchend)="touchend($event)">
  </div>
</ion-content>

This new button calls the "startModel()" function:

  async startModel() {
    this.insomnia.keepAwake();
    this.runMode = "test";
    this.isRunning = true;

    this.model = await tf.loadLayersModel('/assets/model.json');
    this.oneSec = new Array();
    this.currentRow = 0;

    this.accelEventListener = (event: DeviceMotionEvent) => {
      var accel = event.acceleration;
      var accelg = event.accelerationIncludingGravity;
      var rotrate = event.rotationRate;
      var orient = this.currentOrientation;

      if (orient != null && orient !== undefined) {
        var row = [
          accel.x, accel.y, accel.z,
          accelg.x, accelg.y, accelg.z,
          rotrate.alpha, rotrate.beta, rotrate.gamma,
          orient.alpha, orient.beta, orient.gamma
        ];
        this.oneSec.push(row);
        this.currentRow++;
      }

      if (this.currentRow === 60) {
        var data = tf.tensor(this.oneSec);
        const output = this.model.predict(data.reshape([1, 60, 12])) as any;
        var prediction = output.arraySync();
        console.log(prediction);
        // index 1 is the positive ("activity") class in my model's output
        for(var i = 1; i < 2; i++) {
          if (prediction[0][i] > 0.9) {
            var detected = "Activity Detected " + (prediction[0][i] * 100).toFixed(0);
            console.log(detected);
            this.presentToast(detected);
          }
        }
        this.oneSec = new Array();
        this.currentRow = 0;
      }
    }
    window.addEventListener('devicemotion', this.accelEventListener);

    this.orientEventListener = (event: DeviceOrientationEvent) => {
      this.currentOrientation = event;
    }
    window.addEventListener('deviceorientation', this.orientEventListener);
  }

In the above code, a prediction is made every second, which at 60 Hz is 60 samples. The reshape to [1, 60, 12] turns the buffered samples into a batch of one window with 60 time steps and 12 features, matching what the model was trained on. If the probability of detection is above 90%, we show a "toast" notification which lasts about 1 second. That's all there is to it!

A few new variables are added to the class:

  model:tf.LayersModel;
  oneSec:Array<any>;
  currentRow = 0;

To get the "toast" and TensorFlow.js to work, we need to make a few small modifications. First, we import the libraries at the beginning of the module:

import { ToastController } from '@ionic/angular';
import * as tf from '@tensorflow/tfjs';

And the ToastController needs to be injected into the constructor:

  constructor(public datepipe: DatePipe, private insomnia: Insomnia, public toastController: ToastController) {
    // make sure directory exists
    this.checkDirectory();
  }
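
The "presentToast()" function called from "startModel()" has not been shown yet; here is a minimal version using Ionic's ToastController (the one-second duration is my choice, to roughly match the detection window):

  async presentToast(message: string) {
    const toast = await this.toastController.create({
      message: message,
      duration: 1000 // milliseconds
    });
    await toast.present();
  }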

And "stopCollect()" needs to act a bit differently:

  stopCollect() {
    window.removeEventListener('devicemotion', this.accelEventListener);
    window.removeEventListener('deviceorientation', this.orientEventListener);
    if (this.model != null && this.model !== undefined) {
      this.model = null;
    } else {
      this.saveAccel();
      this.sensorFilename = "";
      this.touchFilename = "";
    }
    this.currentOrientation = null;
    this.insomnia.allowSleepAgain();
    this.runMode = "";
    this.isRunning = false;
  }

Conclusion

Data science projects are fun. It's not just "give me the data and I'll build a model": you may need to decide what data to collect, build tools to collect it, and later run your model in the real world to see how it does.
