Laboratory #4c
Data Analysis and Pattern Recognition
3. Classifying BioMuse Data
3.1 Introduction
The experiments in this part of Lab 4 use the Analysis -> Covariance
tool in the Pattern Recognition and Feature Extraction Toolbox. The
purpose is to use measured BioMuse data to create a classifier that
discriminates between two gestures.
3.2 Gathering the BioMuse Data
The procedure for gathering BioMuse gesture data is exactly the same
as in Lab #3, Part 3.2, except that we will now save the data to files.
Those of you who saved your data can skip the collection and
saving part here.
1. Connect the BioMuse (or MiniMuse) to the PC serial port.
2. Plug the BioMuse (or MiniMuse) into the wall outlet.
3. Snap the electrodes into the arm bands.
4. Place one arm band on the arm so that the center electrode is on the
back of the forearm about 3/4 of the way up toward the elbow.
5. Plug the armband into the box.
6. Run Matlab.
7. Load the program called bio-daq.prj in LabWindows CVI and run it. (Recall
that this program collects two channels of EMG envelope data.)
8. Set the gain on channel 1 to 127, and set the gain on channel 2 to 0.
Set the "device" to 1 and the "channel" to 0. A green
light should appear.
9. Click on the start button.
10. Flex your muscle on the back of your arm by bending your wrist backwards.
You should see VU meter #1 move when you flex and go to zero when relaxed.
11. Click the stop button.
12. Place the other arm band on the front of your forearm with the center
sensor approximately opposite the first band.
13. While still running bio-daq, increase the gain on channel 2 to 127.
14. Press the start button and bend your hand forward. You should see both
meters going up and down.
15. After about 6 seconds, press the stop button.
16. Press the Mat-Lab button and wait a few seconds.
17. Typing "who" in Matlab should reveal "muse_data1"
and "muse_data2". Use the Matlab "cd" command to change
to your directory. Then save the measurements to files in a form that can
be used by lab4 by typing
data = muse_data1;
save forward1 data;
data = muse_data2;
save forward2 data;
data = [muse_data1 muse_data2];
save forward data;
(What we are doing here is saving muse_data1 and muse_data2 as two one-dimensional
arrays for graphing, and as one two-dimensional array for covariance analysis.)
18. Press the start button and bend your hand back.
19. After about 6 seconds, press the stop button.
20. Press the Mat-Lab button and wait a few seconds.
21. Save the data to a file by typing
data = muse_data1;
save back1 data;
data = muse_data2;
save back2 data;
data = [muse_data1 muse_data2];
save back data;
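The save commands in Steps 17 and 21 can be sketched in NumPy terms to show the array shapes involved. This is only an illustration: the variable names mirror the Matlab workspace, and the sample count of 500 is made up.

```python
import numpy as np

# Hypothetical stand-ins for the two EMG envelope channels,
# as column vectors of (invented) length 500.
muse_data1 = np.random.rand(500, 1)  # channel 1: back of forearm
muse_data2 = np.random.rand(500, 1)  # channel 2: front of forearm

# Two one-dimensional arrays, kept separate for graphing.
forward1 = muse_data1
forward2 = muse_data2

# One two-dimensional array for covariance analysis: each row is a
# sample, each column a feature (one arm band).
forward = np.hstack([muse_data1, muse_data2])
print(forward.shape)  # (500, 2)
```

The two-column layout matters because the covariance tool treats each column as one feature and each row as one simultaneous measurement of both.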
3.3 Classifying the BioMuse Data
We begin by inspecting the data to see if it appears to be valid.
1. Type lab4
2. Choose Graph -> Lab Data, and then choose Options -> Plot Data
-> Two Files. Enter forward1 and forward2 for
File 1 and File 2, and click "Continue". Enter the range of points
that you want to inspect and click "Continue". The graphs should
show the data for the sensors in the two arm bands. These are the two "features"
that we will use to discriminate the forward gesture from the backward gesture.
Although there will be fluctuations in the measurements, through most of
the interval you should have roughly constant values for the features, something
like the graphs shown below. If your measurements are very different, you
might want to try getting a new set.
3. Choose Options -> Exit.
4. Choose Analysis -> Covariance
5. Choose Options -> Covariance Analysis
6. Enter forward and back for the Files to be Analyzed and
click "Continue".
7. Enter 1 for Feature 1 and 2 for Feature 2, and click "Continue".
8. We will use the holdout method for testing. For both File 1 and File
2, reserve the second half of the range for test, and click "Continue".
- Explain why the data points do not seem to be randomly scattered,
but instead trace out trajectories.
9. Display the Euclidean Separator and the Mahalanobis Separator;
use Analysis -> Compute Euclidean Statistics and Analysis -> Compute
Mahalanobis Statistics to find the percentage of points correctly classified.
- Record the Euclidean and the Mahalanobis statistics.
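The lab4 software computes these statistics for you. Purely as an illustration of what the two separators do, the sketch below builds both classifiers from synthetic two-feature data (the class means, spreads, and sample counts are invented stand-ins for the forward and back files) and scores them with the same second-half holdout used in Step 8.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the "forward" and "back" files:
# 200 samples x 2 features each, with invented means and spreads.
forward = rng.normal([2.0, 0.5], [0.3, 0.6], size=(200, 2))
back = rng.normal([0.5, 2.0], [0.6, 0.3], size=(200, 2))

def holdout(data):
    """Reserve the second half of the file for testing (Step 8)."""
    n = len(data) // 2
    return data[:n], data[n:]

f_train, f_test = holdout(forward)
b_train, b_test = holdout(back)

# Class statistics estimated from the training halves only.
means = [f_train.mean(axis=0), b_train.mean(axis=0)]
inv_covs = [np.linalg.inv(np.cov(c, rowvar=False)) for c in (f_train, b_train)]

def classify_euclidean(x):
    # Nearest class mean in ordinary (Euclidean) distance.
    return int(np.argmin([np.sum((x - m) ** 2) for m in means]))

def classify_mahalanobis(x):
    # Nearest class mean in Mahalanobis distance, which weights each
    # direction by the inverse covariance (the spread) of the class.
    return int(np.argmin([(x - m) @ ic @ (x - m)
                          for m, ic in zip(means, inv_covs)]))

def accuracy(rule):
    test = np.vstack([f_test, b_test])
    labels = np.array([0] * len(f_test) + [1] * len(b_test))
    preds = np.array([rule(x) for x in test])
    return float((preds == labels).mean())

print("Euclidean accuracy:  ", accuracy(classify_euclidean))
print("Mahalanobis accuracy:", accuracy(classify_mahalanobis))
```

Because each synthetic class here has a different spread in each direction, this also hints at why the two separators can disagree on borderline points.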
10. There are three common reasons why performance at this point might be
rather poor:
- You may not have positioned the sensors well.
- Your data may contain a mixture of points where the hand was in an
"in between" position.
- You are not taking advantage of the fact that the hand stays in one
position for an extended period of time.
The way to handle the first case is obvious. The third problem is not too
hard to solve, but it requires a sequential algorithm that is beyond the
scope of the lab4 software. The following steps address the
second problem.
11. Repeat Step 2. Find an interval in which the data seem to have "settled
down". For example, in the figure shown above,
the interval from 100 to 400 seems pretty stable. Suppose that you have
an interval of stable data from n1 to n2.
Extract that interval from your data by typing the following Matlab commands:
load forward;
data = data(n1:n2, :);
save forward2 data;
load back;
data = data(n1:n2, :);
save back2 data;
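Choosing n1 and n2 by eye is all the lab requires. As a rough illustration of what "settled down" means (this is not part of the lab4 software, and the window length and threshold below are arbitrary assumptions), one can look for the stretch where a moving-window standard deviation of every feature stays small:

```python
import numpy as np

def stable_interval(data, window=50, threshold=0.1):
    """Return (n1, n2) bounding the quiet region: the window starts
    whose moving-window standard deviation is below `threshold` in
    every feature column. Assumes one contiguous quiet region."""
    n = len(data)
    stds = np.array([data[i:i + window].std(axis=0)
                     for i in range(n - window + 1)])
    quiet = np.flatnonzero((stds < threshold).all(axis=1))
    return int(quiet[0]), int(quiet[-1] + window)

# Synthetic example: a 100-sample transition ramp, then 400 stable samples.
rng = np.random.default_rng(1)
ramp = np.linspace([0.0, 0.0], [2.0, 1.0], 100)
stable = np.full((400, 2), [2.0, 1.0]) + 0.01 * rng.standard_normal((400, 2))
data = np.vstack([ramp, stable])

n1, n2 = stable_interval(data)
print(n1, n2)  # detected interval starts near the end of the ramp
```

On real BioMuse data the threshold would have to be tuned to the noise level of your own recordings.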
12. Repeat Steps 4 to 9, except use forward2 and back2
in Step 6.
- Compare the performance to the results obtained in Step 9.
- Which classifier is better, the Mahalanobis classifier or the Euclidean
classifier? Can you explain the reason for the difference?
- With live data, it might not be obvious that we are in a "stable"
situation that has "settled down"; suggest a way to reject points
that are in a "transition" state between stable states.
- Is this test sufficient to determine the error rate in using the BioMuse
to discriminate these two gestures? If not, describe what would be required
to obtain a reliable estimate of the error rate in practice.