Face Detection with the Raspberry Pi Camera Board

I have a very basic face detection routine running with the Raspberry Pi camera board.

To do this I used Robidouille’s library functions (see previous post). I then modified the raspicam_cv.c example to use the face detection routine from Learning OpenCV. A few tweaks were needed, so I have posted the code below. You also need to modify the makefile to link the OpenCV object-detection libraries.


/*

Modified from code supplied by Emil Valkov (Raspicam libraries) and Noah Kuntz (Face detection)

License: http://www.opensource.org/licenses/bsd-license.php

*/

#include <cv.h>
#include <highgui.h>

#include "RaspiCamCV.h"

int main(int argc, const char** argv){

//Initialise Camera object
 RaspiCamCvCapture * capture = raspiCamCvCreateCameraCapture(0); // Index doesn't really matter

 //initialise memory storage for Haar objects
 CvMemStorage* storage = cvCreateMemStorage(0);

 //Set up Haar Cascade - the XML file must be in the same directory as the program
 CvHaarClassifierCascade* cascade = (CvHaarClassifierCascade*)cvLoad( "haarcascade_frontalface_alt2.xml", 0, 0, 0);

 //Set scale down factor
 double scale = 1.8;

//Set colours for multiple faces
 static CvScalar colors[] = { {{0,0,255}}, {{0,128,255}}, {{0,255,255}}, {{0,255,0}}, {{255,128,0}}, {{255,255,0}}, {{255,0,0}}, {{255,0,255}} };

 //Open Window for Viewing
 cvNamedWindow("RaspiCamTest", 1);

 //Loop for frames - while no keypress
 do {
 //Capture a frame
 IplImage* img = raspiCamCvQueryFrame(capture);

 //Clear memory object
 cvClearMemStorage( storage );

 // IMAGE PREPARATION:
 //Initialise grayscale image
 IplImage* gray = cvCreateImage( cvSize(img->width,img->height), 8, 1 );

 //Shrink image
 IplImage* small_img = cvCreateImage(cvSize( cvRound(img->width/scale), cvRound(img->height/scale)), 8, 1 );

 //Convert to gray
 cvCvtColor( img, gray, CV_BGR2GRAY );

 //Resize to small image size
 cvResize( gray, small_img, CV_INTER_LINEAR );

 //Finished with gray image - release memory
 cvReleaseImage( &gray );

 //Vertical flip image as camera is upside down
 cvFlip(small_img, NULL, -1);

 //Equalise
 cvEqualizeHist( small_img, small_img );

 // Detect objects - the last arg is the max size; experiment with the parameters to optimise
 //A 6th arg (flags) of 4 = CV_HAAR_FIND_BIGGEST_OBJECT, so only the biggest face is detected
 CvSeq* objects = cvHaarDetectObjects( small_img, cascade, storage, 1.1, 4, 4, cvSize( 40, 50 ), cvSize(small_img->width, small_img->height));

 int i;
 // LOOP THROUGH FOUND OBJECTS AND DRAW BOXES AROUND THEM
 for(i = 0; i < (objects ? objects->total : 0); i++ )
 {
 CvRect* r = (CvRect*)cvGetSeqElem( objects, i );

 //My compiler doesn't seem to cope with default arguments - all args must be specified
 //(also note '.' becomes '->' as r is a pointer)
 cvRectangle(small_img, cvPoint(r->x,r->y), cvPoint(r->x+r->width,r->y+r->height), colors[i%8], 2, 8, 0);
 }

 cvShowImage("RaspiCamTest", small_img);
 cvReleaseImage( &small_img );

 } while (cvWaitKey(10) < 0);

 //Close window
 cvDestroyWindow("RaspiCamTest");

 //Release memory
 raspiCamCvReleaseCapture(&capture);

 return 0;

}

Makefile:


OBJS = objs

CFLAGS_OPENCV = -I/usr/include/opencv
LDFLAGS2_OPENCV = -lopencv_highgui -lopencv_core -lopencv_legacy -lopencv_video -lopencv_features2d -lopencv_calib3d -lopencv_imgproc -lopencv_objdetect

USERLAND_ROOT = $(HOME)/git/raspberrypi/userland
CFLAGS_PI = \
 -I$(USERLAND_ROOT)/host_applications/linux/libs/bcm_host/include \
 -I$(USERLAND_ROOT)/host_applications/linux/apps/raspicam \
 -I$(USERLAND_ROOT) \
 -I$(USERLAND_ROOT)/interface/vcos/pthreads \
 -I$(USERLAND_ROOT)/interface/vmcs_host/linux \
 -I$(USERLAND_ROOT)/interface/mmal \

LDFLAGS_PI = -L$(USERLAND_ROOT)/build/lib -lmmal_core -lmmal -lmmal_util -lvcos -lbcm_host

BUILD_TYPE=debug
#BUILD_TYPE=release

CFLAGS_COMMON = -Wno-multichar -g $(CFLAGS_OPENCV) $(CFLAGS_PI) -MD

ifeq ($(BUILD_TYPE), debug)
 CFLAGS = $(CFLAGS_COMMON)
endif
ifeq ($(BUILD_TYPE), release)
 CFLAGS = $(CFLAGS_COMMON) -O3
endif

LDFLAGS =
LDFLAGS2 = $(LDFLAGS2_OPENCV) $(LDFLAGS_PI) -lX11 -lXext -lrt -lstdc++

RASPICAMCV_OBJS = \
 $(OBJS)/RaspiCamControl.o \
 $(OBJS)/RaspiCLI.o \
 $(OBJS)/RaspiCamCV.o \

RASPICAMTEST_OBJS = \
 $(OBJS)/RaspiCamTest.o \

TARGETS = libraspicamcv.a raspicamtest

all: $(TARGETS)

$(OBJS)/%.o: %.c
 gcc -c $(CFLAGS) $< -o $@

$(OBJS)/%.o: $(USERLAND_ROOT)/host_applications/linux/apps/raspicam/%.c
 gcc -c $(CFLAGS) $< -o $@

libraspicamcv.a: $(RASPICAMCV_OBJS)
 ar rcs libraspicamcv.a $+

raspicamtest: $(RASPICAMTEST_OBJS) libraspicamcv.a
 gcc $(LDFLAGS) $+ $(LDFLAGS2) -L. -lraspicamcv -o $@

clean:
 rm -f $(OBJS)/* $(TARGETS)

-include $(OBJS)/*.d

Hacker News Update: Raspicam & WeMo

A quick update on my recent discoveries.

Raspicam

I now have a Raspberry Pi Camera Board (Raspicam)!

There is a brilliant combo deal on at the moment allowing you to buy a Raspicam, a Model A and a 4GB SD card for about £35 (including VAT and shipping!). That’s £35 for a device that can run OpenCV with a camera capable of 30fps at HD resolutions. I will leave you to think about that for a moment.

The downside is that the software is still not quite there. The Raspicam couples directly to the Raspberry Pi, which means it is not (at the moment) exposed as a standard USB video device (e.g. /dev/video0 on Linux). Most Linux software, and packages like SimpleCV, expect a standard USB video device. This means that, as of 24 October 2013, you cannot use SimpleCV with the Raspicam.

However, not to fret! The Internet is on it. I imagine that we will see better drivers for the Raspicam from the official development communities very soon. While we wait:

WeMo and Python

As you will see from the previous posts, I have been using IFTTT as a makeshift interface between my Raspberry Pi and my WeMo Motion detector and switch. This morning, though, I found a Python module that appears to let you control the switch and listen for motion events from Python. Hurray!

The module is called ouimeaux (there is a French theme this week). Details can be found here: link.

Very soon I hope to adapt my existing code to control my Hue lights based on motion events (e.g. turn on when someone walks in the room, turn off when no motion). Watch this space.

Magic Lights: Controlling Hue with a Raspberry Pi

This post shows you how to control Hue light bulbs using a Raspberry Pi. In particular, it shows you how to fade up your lights half an hour before sunset.


In true Heath Robinson style the solution is rather convoluted. But hey, that’s where the fun resides.

Email me, Sun

First, we set up IFTTT to email us sunrise and sunset times.

This involves setting up two recipes triggered by the Weather channel.

Read my emails, Pi

Now we need a Python script to read our Gmail. First, via the Gmail web interface, I created a filter to auto-label emails from IFTTT containing the word “Sun” with the label “Sun”. This makes the emails easier to process, as we then only need to read the email ‘folder’ “Sun”.

The email reading script is similar to that used in the previous post.

#!/usr/bin/env python

import imaplib
from email.parser import HeaderParser
import sqlite3 as lite
import datetime

def extract_date(word):
	date_index_start = word.find('for ')+4
	date_index_end = word.find(' at')+11
	date_out = word[date_index_start:date_index_end]
	return date_out

def extract_sunevent(word):
	word_out = word[0:7]
	return word_out.strip()

def read_subjects(label):
	obj = imaplib.IMAP4_SSL('imap.gmail.com', 993)
	obj.login('username@gmail.com', 'password')
	obj.select(label)
	typ, data = obj.search(None, 'UnSeen')

	subjects = []

	for num in data[0].split():
		typ, msg_data = obj.fetch(num, '(BODY[HEADER])')
		header_data = msg_data[0][1]
		parser = HeaderParser()
		msg = parser.parsestr(header_data)
		subjects.append(msg['Subject'])
	return subjects

def storesuninsql(rows):
	#Save in database
	con = lite.connect('/home/[pi username]/sun.db')

	with con:
	    cur = con.cursor()
	    #Create a sun event table if it doesn't already exist
	    cur.execute('CREATE TABLE IF NOT EXISTS sun (r_datetime TIMESTAMP, sunevent TEXT)')

	    for row_values in rows:
	    	cur.execute('INSERT INTO sun VALUES(?,?)', (row_values[0], row_values[1]))

def store_sun(subjects):
	#Initialise temporary array for data
	rows = []
	unread_count = len(subjects)

	#Process and store unread mail items
	for j in range(0,unread_count):
		#print subjects[j]
		#Extract date/time of sun event
		extracted_date = extract_date(subjects[j])
		#Extract time of sunevent
		event_time = datetime.datetime.strptime(extracted_date, '%B %d, %Y at %I:%M%p')
		#Extract event name
		event_name = extract_sunevent(subjects[j])
		#Add (event time, event name) tuple to rows
		rows.append((event_time, event_name))
	storesuninsql(rows)

if __name__ == '__main__':
	subjects = read_subjects('Sun')
	store_sun(subjects)

The code ‘reads’ the emails using the imaplib Python library. It extracts the subject lines, which indicate the sun event (“Sunset” or “Sunrise”), the date and the time. It then stores a datetime stamp in an SQLite database with a text field indicating the event type.
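As a quick illustration of the parsing, here is the extraction logic run against a sample subject line (Python 3 syntax; the exact IFTTT subject format shown here is my assumption, so check it against your own emails and adjust the offsets if yours differs):

```python
import datetime

def extract_date(word):
    # Take the text between "for " and the end of the "at HH:MMAM/PM" time
    date_index_start = word.find('for ') + 4
    date_index_end = word.find(' at') + 11
    return word[date_index_start:date_index_end]

def extract_sunevent(word):
    # The event name ("Sunrise" or "Sunset") starts the subject line
    return word[0:7].strip()

# Hypothetical IFTTT subject line
subject = "Sunset for October 6, 2013 at 06:45PM"

event_name = extract_sunevent(subject)
event_time = datetime.datetime.strptime(extract_date(subject),
                                        '%B %d, %Y at %I:%M%p')
print(event_name, event_time)  # Sunset 2013-10-06 18:45:00
```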

This Python script is then scheduled to run (via cron) overnight.

Fade Up

Now we write a little Python script to fade up all our Hue light bulbs. This has become a lot easier since Philips launched the official Hue API and documentation in March 2013 (fair play to Philips – I think this is done really well).

First go to the Getting Started page of the Hue Developers site. Follow the simple steps there to add and authenticate a new user.

Next we write the code. In summary, the Hue API works using an HTTP PUT request. This passes a JSON object (basically a string) containing ‘variablename:value’ pairs that set the state of a bulb. More detail can be found on the Hue Developers site. To fade up I use the ‘transitiontime’ variable, which I set to 6000 (6000*100ms = 10 minutes). To be snazzy I use the xy values for the D65 standard illuminant – [0.32,0.33]. Another nice colour, with an orangey sunset feel, has xy values of around [0.43,0.53].

The code is:

import requests
import time

ip = "[Your Bridge IP Address]"
sunrise='{"on":true,"bri":0,"xy":[0.32,0.33]}'

def set_light(light, data):
	global ip
	requests.put("http://%s/api/[username]/lights/%s/state" % (ip, light), data=data)

def fade_up_all():
	for i in range(1, 4): #lights 1 to 3 - adjust for your number of bulbs
		set_light(i, sunrise)
		time.sleep(1)
		data = '{"on":true, "bri":255, "transitiontime":6000}'
		set_light(i, data)

fade_up_all()

PS: I did try to use a for-loop to iterate through brightness values from 1 to 255 but this caused some buggy behaviour – the lights would flash and jump in brightness. The ‘transitiontime’ variable is a great improvement.
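If hand-writing the JSON strings gets fiddly, the state object can also be built with json.dumps – a small sketch (the ‘on’, ‘bri’, ‘xy’ and ‘transitiontime’ field names are from the Hue API; transitiontime is in units of 100ms):

```python
import json

# transitiontime is in units of 100 ms, so 10 minutes = 600 s = 6000 units
minutes = 10
transitiontime = minutes * 60 * 10

state = json.dumps({
    "on": True,
    "bri": 255,
    "xy": [0.32, 0.33],  # roughly the D65 white point used above
    "transitiontime": transitiontime,
})
print(state)
```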

Cron Legacy

Finally we have my favourite bit – editing a crontab via Python to schedule our fade-up script half an hour before sunset.

Luckily some lovely person has provided a module called, unsurprisingly, python-crontab. To install it I first had to install the python-dateutil module:

sudo pip install python-dateutil
sudo pip install python-crontab

The documentation can be found at the above link. The comments in the code below will hopefully explain how to use it – it’s pretty simple. The only trick is working out the syntax: [crontab object].[crontab field].on([value]) sets that field’s value; for example, cron.minute.on(5) sets the minute field to 5. You also need to clear each field (e.g. cron.minute.clear()) before setting it with .on(), otherwise the value is added to the existing one (e.g. an existing value of ‘0’ plus the example above would give ‘0,5’).

In the code below, the first function extracts the time of the last sunset from the database and returns hour and minute values for a time half an hour before yesterday’s sunset (the trick there is to use timedelta). Also remember with cron to use absolute paths in your code – cron jobs are typically run as if you were in your home directory.
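To make the timedelta trick concrete, here is a minimal sketch (the sunset timestamp is hypothetical):

```python
import datetime

# Yesterday's sunset, as it would come back from the SQLite database
record = "2013-10-06 18:45:00"
sunsetdt = datetime.datetime.strptime(record, "%Y-%m-%d %H:%M:%S")

# Half an hour before sunset is the fade-up trigger time
trigger = sunsetdt - datetime.timedelta(minutes=30)
print(trigger.hour, trigger.minute)  # 18 15
```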

from crontab import CronTab
import datetime
import sqlite3 as lite

def gettime():
	con = lite.connect('/home/[pi username]/sun.db')

	#Get last sunset time
	cur = con.execute("SELECT r_datetime FROM sun WHERE sunevent='Sunset' ORDER BY r_datetime DESC LIMIT 1")
	#We can use fetchone() as only one record will be returned
	record = cur.fetchone()
	sunsetdt = datetime.datetime.strptime(record[0], "%Y-%m-%d %H:%M:%S")
	timearray = [sunsetdt.hour, sunsetdt.minute]
	return timearray

def setcron(timearray):
	#timearray input is an array in the form [hour, minute]
	hour = timearray[0]
	minute = timearray[1]
	user_cron = CronTab('[Your raspberry pi username]') #Open crontab for user

	jobs = user_cron.find_comment('sunon') #look for existing fade-up job by its comment
	if not jobs: #if no job exists yet
		job = user_cron.new(command='python /home/[pi username]/MyCode/Hue/fadeupall.py',comment='sunon') #Add new job
	else: #if existing job
		job = jobs[0] #select first job returned from search
	job.minute.clear() #Clear previous minute
	job.hour.clear() #Clear previous hour
	job.minute.on(minute) #set minute value
	job.hour.on(hour) #set hour value
	job.enable() #enable job
	user_cron.write() #write crontab

timearray = gettime()
setcron(timearray)

The above code is scheduled to run once a day sometime after midnight, after the email-reading script above. At some point I’ll get all this onto github. Update: I have finally added the code (updated in places) to github: https://github.com/benhoyle/Hue.

Hey presto – my lights turn on half an hour before sunset every day!

Doing Useful Things with WeMo Motion

Using the WeMo Motion rules and IFTTT allows you to do certain things with this motion detector. However, to expand the possibilities it would help if we could store our motion data and make it accessible to the programs that we write.

To store our motion data in a database we need a bit of a convoluted process. It goes something like this:

First we set up an IFTTT recipe to send an email to a Gmail account when motion is detected.

Having done this you will get a series of emails.
I recommend setting up a separate Gmail account for automation to avoid spamming yourself with IFTTT emails. It would also make things more secure for the next steps. To make things easier when using multiple IFTTT recipe emails, I set up a filtering rule in Gmail to automatically label all emails like this as “Motion”.

The next step is to write some Python code to access our emails, process messages and store data in an SQLite database.

  • The email processing makes use of the imaplib library and the HeaderParser class from email.parser; and
  • The database processing makes use of the sqlite3 library.

A first function accesses all unread emails with a particular label and returns an array of the subject lines of those emails.

def read_subjects(label):
	obj = imaplib.IMAP4_SSL('imap.gmail.com', 993)
	obj.login('username@gmail.com', 'password')
	obj.select(label)  # select the 'Motion' label rather than the inbox
	typ, data = obj.search(None, 'UnSeen')

	subjects = []

	for num in data[0].split():
		typ, msg_data = obj.fetch(num, '(BODY[HEADER])')
		header_data = msg_data[0][1]
		parser = HeaderParser()
		msg = parser.parsestr(header_data)
		subjects.append(msg['Subject'])

	return subjects

A second set of functions then processes each subject line to extract a ‘datetime’ that the motion occurred and a motion sensor name.

def store_motion(subjects):
	#Initialise temporary array for data
	rows = []
	unread_count = len(subjects)

	#Process and store unread mail items
	for j in range(0,unread_count):
		#print subjects[j]
		#Extract date/time of last motion
		extracted_date = extract_date(subjects[j])
		#Extract motion time
		motion_time = datetime.datetime.strptime(extracted_date, "%B %d, %Y at %I:%M%p")
		#Extract sensor name
		s_name = extract_sensor(subjects[j])
		#Add (motion time, sensor name) tuple to rows
		rows.append((motion_time, s_name))
	#print rows
	storemotioninsql(rows)

def extract_date(word):
	date_index_start = word.find("ion: ")+5
	date_index_end = word.find(" at")+11
	date_out = word[date_index_start:date_index_end]
	return date_out

def extract_sensor(word):
	name_end = word.find("' ")
	word_out = word[1:name_end]
	return word_out
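Putting the two helpers together on a sample subject line (Python 3 syntax; the subject format below is my guess at what IFTTT sends, so verify it against your own emails and adjust the offsets if needed):

```python
import datetime

def extract_date(word):
    # Text between "...motion: " and the end of the "at HH:MMAM/PM" time
    date_index_start = word.find("ion: ") + 5
    date_index_end = word.find(" at") + 11
    return word[date_index_start:date_index_end]

def extract_sensor(word):
    # The sensor name is quoted at the start of the subject
    name_end = word.find("' ")
    return word[1:name_end]

# Hypothetical subject format
subject = "'WeMo Motion' detected motion: October 4, 2013 at 06:38AM"

sensor = extract_sensor(subject)
motion_time = datetime.datetime.strptime(extract_date(subject),
                                         "%B %d, %Y at %I:%M%p")
print(sensor, motion_time)  # WeMo Motion 2013-10-04 06:38:00
```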

A third function stores the prepared ‘datetime’ and motion sensor name in an SQLite database.

def storemotioninsql(rows):
	#Save in database (use an absolute path here if the script is run via cron)
	con = lite.connect('motion.db')

	with con:

	    cur = con.cursor()

	    #Create a READINGS table if it doesn't already exist
	    cur.execute('CREATE TABLE IF NOT EXISTS motion (r_datetime TIMESTAMP, r_s_name TEXT)')

	    for row_values in rows:
	    	#print row_values
	    	cur.execute('INSERT INTO motion VALUES(?,?)', (row_values[0], row_values[1]))

All that remains is to set these functions up in a Python script and then use cron to schedule it to run every 15 minutes (crontab -e etc…).
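For reference, the crontab entry could look something like this (added with crontab -e; the script name and path are placeholders – use your own absolute path):

```
# m   h  dom mon dow  command
*/15  *  *   *   *    python /home/[pi username]/MyCode/WeMo/readmotion.py
```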