Setting up Hubs

The next step is to set up a hub. Since the hub will be processing far more data, uploading files and messages from many nodes to the Azure Service Bus, we opted for a more powerful board, the UDOO Quad.

The hub also acts as a node: it carries out motion detection itself, in addition to collecting data from the other nodes and uploading everything.

We begin by setting up the hub as a node. Follow the same procedure as in the previous section, up to the point where motion is configured. Do not go on to set up the transport methods (flashair, etc.), as we do not need to move the data off the hub; instead we upload it to the Azure Service Bus.

Step 1: Set up the folders

Figure 4: Folder setup

Step 2: Set up the scripts

Figure 5:

Figure 6:

Ensure that you have the file paths corrected accordingly.

Figure 7:

Step 3: Set up motion

videodevice /dev/video3 #default /dev/video0 but on the UDOO my camera was picked as video3
width 1280 # I have a better resolution webcam on this UDOO board, so I have set the resolution to 720p
height 720
framerate 30
threshold 2000
pre_capture 2
output_normal first
ffmpeg_video_codec msmpeg4
snapshot_interval 300
target_dir /home/ubuntu/flashair/sync
snapshot_filename %Y%m%d%H%M%S-%v-snapshot
jpeg_filename %Y%m%d%H%M%S-%v-%q 
movie_filename %Y%m%d%H%M%S-%v 
webcam_maxrate 5
webcam_localhost off

on_event_start python /home/ubuntu/ motion_detected_at_%Y%m%d%H%M%S-%v
on_event_end python /home/ubuntu/ motion_ended_at_%Y%m%d%H%M%S-%v
on_picture_save /home/ubuntu/scripts/ %f
on_movie_end /home/ubuntu/scripts/ %f

Figure 8: Motion settings

That completes the node setup on the hub.

We now begin adding the slight tweaks that allow the UDOO to act as a hub, i.e. upload all the data to the Azure service bus.

The first step is to enable the device to upload its own files to the service bus. To do this we need scripts that take the message details and the video/image entries from the log files and upload them to the service bus.

Scripts for the hub

The first Python script uploads the messages from the log file:


# Steven Johnston
# Umang Rajdev
# 29/08/2014

# Python script to upload messages to the Azure Service Bus

# Import required libraries. Ensure the Azure Python SDK is installed
from azure.servicebus import *
import os
import cPickle

# Specify the location of the files to upload
file_location = "/home/ubuntu/flashair/buffer/"

read_data_list = []

# Read data from the log file
with open(file_location + "message_log.log", 'r') as r:
  while True:
      item_name = cPickle.load(r)
    except EOFError:
print read_data_list

# Specify Service Bus address. Change this to your required address.
bus_service = ServiceBusService(service_namespace='<insert namespace>', account_key='<insert key>', issuer='<insert owner>')
topic = "<specify the topic you have created>"

# Upload each message read from the log file
for message in read_data_list:
  print message
  print message.custom_properties
# Send messages to specified topic
  bus_service.send_topic_message(topic, message)
  print "Message " + message.body + " sent"

# Delete the log file
os.remove(file_location + "message_log.log")

Save the file in  the scripts folder in your home directory as
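For reference, the read loop above expects message_log.log to contain records appended one after another with cPickle.dump. A minimal round-trip sketch of that log format (using the plain pickle module and an illustrative dict record rather than the actual Message objects):

```python
import os
import pickle  # cPickle on Python 2
import tempfile

log_path = os.path.join(tempfile.mkdtemp(), "message_log.log")

# Writer side (the motion/node scripts): append one pickled record per event.
for body in ["motion_detected", "motion_ended"]:
    with open(log_path, "ab") as log:
        pickle.dump({"body": body}, log)

# Reader side (the hub script): load records until the file is exhausted.
records = []
with open(log_path, "rb") as log:
    while True:
        try:
            records.append(pickle.load(log))
        except EOFError:
            break

print(records)
```

The same load-until-EOFError pattern is what both hub scripts rely on.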

Next, we need a python script to upload the blobs (images and videos) as well as send a message with the link to the uploaded files. This is the script I use:


# Steven Johnston
# Umang Rajdev
# 29/08/2014

# Python script to upload blobs (images and videos) to the Azure Service Bus

# Import required libraries. Ensure the Azure Python SDK is installed
from azure.servicebus import *
from azure.storage import *  # provides BlobService
import cPickle
import os.path
import os
import subprocess

#Specify the service bus address, file locations and other variables
bus_service = ServiceBusService(service_namespace='<insert namespace>', account_key='<insert account key>', issuer='<insert owner>')
blob_service = BlobService(account_name='<insert account name>', account_key='<insert account key>')
container_video = '<insert name of created video container>'
container_images = '<insert name of created image container>'
url_video = '<video url for blobs>'
url_images = '<image url for blobs>'
topic = '<your topic for messages>'
file_location = "/home/ubuntu/flashair/buffer/" #this is the log file location, same as directory from which the files are uploaded
upload_directory = "/home/ubuntu/flashair/uploaded/"  #this is the directory where uploaded files are moved to

read_data_list = []

# Read data of the files from the log file
with open(file_location + "blobs_log.log", 'r') as r:
  while True:
    try:
      item_name = cPickle.load(r)
      read_data_list.append(item_name)
    except EOFError:
      break

# Extract the data from the log file and upload
for line in read_data_list:
  (blob, mac, serial, clock, localdate, time_str) = line
  filename = os.path.split(blob)[-1]
  ext = os.path.splitext(blob)[-1].lower()
  print blob, filename, ext
# Upload videos to one container, images to another
  if ext == '.avi':
# Upload the video
    blob_service.put_block_blob_from_path(container_video, filename, file_location + filename)
# Create a message to send with the link of the uploaded file
    msg = Message("Motion was detected and a video has been uploaded to " + url_video + filename)
    msg.custom_properties = {'mac_address' : mac, 'serial' : serial, 'time' : clock, 'localdate' : localdate, 'time_str' : time_str} #need to find a better way to get serial numbers off sensors
# Send/Upload the message with the link
    bus_service.send_topic_message(topic, msg)
# Move the file to the uploaded folder from the buffer
    subprocess.call(["sudo", "mv", str(file_location + filename), str(upload_directory + filename)])
    print msg.body
  else:
# Upload the images
    blob_service.put_block_blob_from_path(container_images, filename, file_location + filename)
# Create the message with the image link
    msg = Message("Motion was detected and an image has been uploaded to " + url_images + filename)
    msg.custom_properties = {'mac_address' : mac, 'serial' : serial, 'time' : clock, 'localdate' : localdate, 'time_str' : time_str} #need to find a better way to get serial numbers off sensors
# Send/Upload the message
    bus_service.send_topic_message(topic, msg)
# Move the file from the buffer to the uploaded folder
    subprocess.call(["sudo", "mv", str(file_location + filename), str(upload_directory + filename)])
    print msg.body

# Delete the log file
os.remove(file_location + "blobs_log.log")

Save this again in the scripts folder as

These scripts upload the messages and files from the buffer folder. However, this only covers the data collected by the hub itself; we also need to upload the data gathered from the several nodes. We will use the same scripts, the only difference being the locations of the files to be uploaded.

We will first create the folders where the incoming data from the nodes will reside.

cd flashair
sudo mkdir nodes
cd nodes
sudo mkdir node1
cd node1
sudo mkdir buffer uploaded
cd ..
sudo cp -r node1 node2
sudo cp -r node1 node3

Figure 9: Making the directories and subdirectories for node data

I have created 3 folders, assuming of course that we have 3 nodes. My demonstration shows the setup with only one node, but you can simply replicate it if you have more.
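With more nodes, the same directory tree can be created in a single loop rather than copying folders by hand (a sketch; adjust the node count and base path to your setup):

```shell
#!/bin/bash
# Create buffer/ and uploaded/ subdirectories for each node in one pass.
BASE=/home/ubuntu/flashair/nodes
for i in 1 2 3; do
  sudo mkdir -p "$BASE/node$i/buffer" "$BASE/node$i/uploaded"
done
```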

We now use the same two files, having changed only the file locations.

I simply copy the files from the scripts folder into a sub-directory of it called ‘nodes’. The other change is to alter the file locations to those where the node data resides (i.e. the folders we just created).

A sample of my file in /home/ubuntu/scripts/nodes:

Figure 10: for the nodes

You will need to keep a copy of the two files for each node that you have, purely because of the file locations hard-coded in each one. This is something we are working to solve, so that in future the user needs only one script for all the nodes.
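Until then, one way to get down to a single pair of scripts is to pass the node's buffer directory as a command-line argument instead of hard-coding it. A minimal sketch of the idea (the argument handling here is ours, not part of the original scripts, and the invocation shown in the comment is illustrative):

```python
import sys

DEFAULT_BUFFER = "/home/ubuntu/flashair/buffer/"

def buffer_dir(argv):
    """Pick the buffer directory from argv[1], or fall back to the hub's own."""
    location = argv[1] if len(argv) > 1 else DEFAULT_BUFFER
    # Guarantee a trailing slash so "location + filename" concatenation works.
    if not location.endswith("/"):
        location += "/"
    return location

# e.g. invoked per node from cron as:
#   python <upload script> /home/ubuntu/flashair/nodes/node1/buffer/
print(buffer_dir(sys.argv))
```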

Transporting the data (Receiving from the nodes)

Once this is set up, we need a way to receive the files from the nodes. As in the node setup, we will have 2 methods to receive the files.

Option 1: To receive through External Storage

We need to create a script similar to the that we had in the node setup. Go to the scripts folder and create a file called

cd scripts
sudo nano copyfromusb

This script is quite similar to the, the only difference being where we copy from and to.

touch /home/ubuntu/test
sudo mount -t vfat -o uid=ubuntu,gid=ubuntu /dev/sda1 /media/RSIS/
sleep 10
mv -v /media/RSIS/buffer/* /home/ubuntu/flashair/nodes/node1/buffer/
touch /home/ubuntu/test2
sleep 10
sudo umount /media/RSIS

Make sure you save and make the file executable again.

We now need to invoke this script every time a flash disk is inserted into the UDOO Quad. Let us add this as a rule in udev:

cd /etc/udev/rules.d/
sudo nano 10-imx.rules

Notice that the file in the rules.d folder is different from the one we had on the nodes. This file may vary from system to system; you may have to find the one that works for you by trial and error.

Now add this line to the bottom of the file, and save:


You may also have to check that on the hub your flash drive appears as sda1. If not, make sure you change the device name in the script.

Option 2: WiFi Card

We now need to get a script to download the files off the flashair card and then upload them onto the service bus. There already exists a script for this, which is available on the following github link:

Download the .zip file, and extract the contents to the scripts folder.

Figure 11: Contents of the extracted PyFlashAero-master folder

However, I have modified the contents of some of the files. Navigate to the ‘src’ folder, and replace with this file:

Replace in the ‘FlashAir’ folder in ‘src’ with this

Replace in the same folder with

And finally with

Some of the changes I have made are commented in the respective files. Essentially I have adapted the uploader to our scenario with a few tweaks.

Also, for this downloader to work you will need Python’s PyQt4 installed. The script runs under Python 3, hence we install the package for Python 3:

sudo apt-get install python3-pyqt4

This completes the installation and setup of all the scripts needed for the hub.

Finally, we will write a script and save it in the if-up.d folder, such that it runs every time a network is up, and will download the files off the SD card if the connected network is flashair.

cd /etc/network/if-up.d
sudo nano download

The contents of the download script are:

#!/bin/bash
sleep 5
iwconfig wlan0 | grep "flashair" > /home/ubuntu/debug.txt
if iwconfig wlan0 | grep "flashair" ; then
  touch /home/ubuntu/testdownload
  python3 /home/ubuntu/scripts/PyFlashAero-master/src/ --card_uri http://flashair --folder_remote / --folder_local /home/ubuntu/flashair/nodes/node1/buffer
else
  touch /home/ubuntu/testdownloadfailed
fi

Ensure that you save and make the file executable.

The script is quite similar to the one we used to upload to the flashair. All it does is check that we are connected to flashair and, if so, run the download script with the necessary details, such as the local folder to download to.


We now need a way to utilise all these scripts. Crontab is the perfect utility for this. It allows us to run a script at a specified interval. For more information on crontab see the man page, type “man crontab”.

What we want to do is write a script that runs all of our upload scripts, and have that running continually using crontab, such that every minute if new files are added to the buffer, they will be uploaded to the service bus.

We write a script for this and place it in the scripts folder in the home directory.

cd scripts
sudo nano

The script contains:


#!/bin/bash
if [ -f /tmp/mylockfile ]; then exit 1; fi
touch /tmp/mylockfile

python /home/ubuntu/scripts/
python /home/ubuntu/scripts/
python /home/ubuntu/scripts/nodes/
python /home/ubuntu/scripts/nodes/

rm /tmp/mylockfile

The script works by writing a lockfile in /tmp and then running the required scripts in turn. This matters because the crontab entry will run every minute, but the uploads may occasionally take longer than that, and we don’t want a second instance starting on top of the first. To avoid this, the script creates the lockfile while the uploads run and deletes it when done; the first line simply exits if the lockfile already exists, since that means uploads are still running in the background.
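An alternative to managing the lockfile by hand is flock(1) from util-linux, which releases the lock automatically even if a script crashes mid-run, so no stale lockfile can block future uploads. A sketch (the commented-out commands are placeholders for your own upload scripts):

```shell
#!/bin/bash
# Take an exclusive, non-blocking lock on file descriptor 9;
# exit immediately if another run still holds it.
(
  flock -n 9 || exit 1
  # Run the upload scripts in turn, exactly as in the lockfile version:
  # python /home/ubuntu/scripts/<message upload script>
  # python /home/ubuntu/scripts/<blob upload script>
) 9>/tmp/mylockfile
```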

Ensure you save the file and make it executable.

Finally, to wrap up, we will run this script from the user’s crontab.

crontab -e

Add this line to the bottom of the file to run the script every minute.

* * * * * /home/ubuntu/scripts/
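The five fields are minute, hour, day of month, month, and day of week; five asterisks means “every minute”. If every minute proves too aggressive, an entry of this shape would run the script every five minutes instead (the script path is whatever you saved above):

```
# m   h  dom mon dow  command
*/5   *  *   *   *    /home/ubuntu/scripts/<your upload script>
```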

Figure 12: Addition to crontab

That completes the setup for hubs.
