
Machine Learning for Engineers

Term 1 2024/5

Instructions for Assignment

Introduction: In this assignment you will generate a series of data sets and perform regression and classification using different methods implemented in MATLAB. You will require access to a computer running MATLAB R2012a or later.

Follow the instructions in this document to perform the tasks, generating and saving the results in the required formats. Complete all steps in Parts A to D. At the end of the assignment you will be required to produce a report (see Assignment 1 Report Questions). This assignment is worth 50% of the final mark.

PART A: REGRESSION

1.  Start MATLAB.

2.  Generate a data set from the following function:

f(x) = 3 - 2x + 4x^2 - 5x^3 + x^4

with randomly added Gaussian noise ε ~ N(0, 16). Start by specifying the input values x_i, i = 1, …, 10. Pick 10 equally spaced values between x = 0 and x = 4.5 by issuing the following command in the MATLAB command window:

>> x=0:0.5:4.5;

Now generate the outputs/targets y_i, i = 1, …, 10, from the polynomial above. Evaluate the polynomial at each x_i and add to each value a number generated from N(0, 16) by issuing the following command in the command window:

>> y=3-2*x+4*x.^2-5*x.^3+x.^4 + 4*randn(1,10)

This creates a vector y that has 10 components, which are the targets y_i. The noise is added by generating a vector of 10 random numbers from N(0, 16) (variance of 16) using 4*randn(1,10). The function randn(1,10) generates numbers from a standard normal (zero mean, unit variance). To generate numbers from a general normal distribution N(μ, σ^2), we multiply by the standard deviation σ and add μ, i.e. mu+sigma*randn(1,10), where mu is μ and sigma is σ. To see why this is true, define Y = μ + σX, where X ~ N(0,1). Using the rules E[μ + σX] = μ + σE[X] and Var(μ + σX) = σ^2 Var(X), we see that Y ~ N(μ, σ^2).
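As a quick sanity check of the shift-and-scale rule (illustrative only; the values of mu and sigma below are examples, not part of the assignment), you can draw a large sample and inspect its moments:

```matlab
% Empirical check: mu + sigma*randn draws from N(mu, sigma^2)
mu = 3; sigma = 4;            % example values only
z = mu + sigma*randn(1,1e6);  % shift-and-scale a standard normal sample
mean(z)                       % close to mu
var(z)                        % close to sigma^2 = 16
```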

Save the inputs and targets as files inputs.dat and outputs.dat in ascii format. Generate test data from the true function 𝑓(𝑥) by issuing the following commands:  

>> xt=0:0.15:4.5

>> yt=3-2*xt+4*xt.^2-5*xt.^3+xt.^4

Save xt and yt in files testinputs.dat and testoutputs.dat in ascii format.
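The instructions do not spell out the save commands. One way to write the four files (the block regenerates the data from step 2 so it is self-contained) is:

```matlab
% Recreate the data from step 2 and write it to plain-text (ascii) files
x  = 0:0.5:4.5;                                  % inputs
y  = 3-2*x+4*x.^2-5*x.^3+x.^4 + 4*randn(1,10);   % noisy targets
xt = 0:0.15:4.5;                                  % test inputs
yt = 3-2*xt+4*xt.^2-5*xt.^3+xt.^4;                % noise-free test targets
save -ascii inputs.dat x
save -ascii outputs.dat y
save -ascii testinputs.dat xt
save -ascii testoutputs.dat yt
```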

3.  Now you will perform linear regression using polynomials of varying degree M. Assume the model:

y(x, w) = w_0 + w_1 x + w_2 x^2 + … + w_M x^M
4. Open a script in MATLAB and type the following:

function P=DesignMatrix(x,M,N)

P=zeros(N,M+1); %preallocate the design matrix

for i=0:M

   P(1:N,i+1)=x.^i;

end

Save the script as a MATLAB ‘.m’ file called DesignMatrix.m. You have created a function DesignMatrix(x,M,N) that will calculate the design matrix P for N inputs saved in a vector x, assuming polynomial basis functions up to order M.
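To convince yourself that the function builds the matrix you expect (column j+1 holds x.^j), here is a small self-contained check for M = 1 that repeats the same loop inline:

```matlab
% The design matrix for M = 1: a column of ones and a column of x
x = 0:0.5:4.5;            % the 10 inputs from step 2
N = numel(x); M = 1;
P = zeros(N, M+1);        % preallocate
for i = 0:M
    P(1:N, i+1) = x.^i;   % same loop as in DesignMatrix.m
end
% P(:,1) is all ones (x.^0); P(:,2) equals x
```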

5. x is already defined. Define the number of data points N=10 and the number of test points Nt=31 in the command window. Finally, set the polynomial order as M=1 in the command window. Calculate the design matrix and call it P by issuing the command:

>> P=DesignMatrix(x,M,N)

You should be able to check if the matrix P is correct based on the definition of the design matrix. Now find the Maximum Likelihood solution (see the slides) for the vector of weights

>> w=inv(P'*P)*P'*y'

Note that y is a row vector, so it needs to be transposed in the formula.
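A side note on numerics (not required by the assignment): inv(P'*P)*P'*y' can lose accuracy when P'*P is ill-conditioned, which happens at larger M. MATLAB's backslash operator computes the same least-squares solution more stably:

```matlab
% Backslash gives the same maximum-likelihood weights as the normal equations
x = 0:0.5:4.5;
y = 3-2*x+4*x.^2-5*x.^3+x.^4;   % noise-free targets, just for this comparison
P = [ones(10,1), x(:)];          % design matrix for M = 1
w1 = inv(P'*P)*P'*y';            % formula from the instructions
w2 = P\y';                       % equivalent, numerically preferred
% norm(w1 - w2) is at the level of rounding error
```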

6.  Calculate the predicted values of the function at the test inputs xt. First define a design matrix Pt corresponding to the test inputs and then multiply it by w. Think about why this works by examining the formulae in the slides. This approach stays compact when you move to higher values of M in step 10 below. The predicted values are collected in a vector fP.

>> Pt=DesignMatrix(xt,M,Nt)

>> fP=(Pt*w)'

You can alternatively use fP=w(1)+w(2)*xt.

7.  Plot the data points, the predicted values at the test points and the test data in one figure:

>> plot(x,y,'ro');

>> hold on

>> plot(xt,fP,'b*');

>> plot(xt,yt,'ks');

Note the transpose in fP=(Pt*w)'. Without it you will get dimension errors when calculating the squared error below.

8.  Save the figure as a MATLAB figure. Also save it as an EPS file. Make sure the axes are labelled and a legend is placed in the figure.
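The labelling and saving commands are not given explicitly; one possibility (the file names here are just examples, and the figure from step 7 is assumed to be the current figure) is:

```matlab
% Label the axes, add a legend, then save as .fig and as EPS
xlabel('x'); ylabel('y');
legend('training data','predicted values','test data');
saveas(gcf, 'regression_M1.fig');      % MATLAB figure file
print('-depsc', 'regression_M1.eps');  % colour EPS file
```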

9.  Calculate the total square error against the test values, calling it err

>> err=sum((fP-yt).^2)

Alternatively, you can use a for loop and then sum the result:

for i=1:31

err(i)=(fP(i)-yt(i))^2;

end

err=sum(err)

The command sum(v) sums the components of a vector v. Record this error.

10. Now repeat the procedure in steps 5. to 9. using higher order polynomials M=3, M=4 and M=9, saving the figure with all plots and recording the total square error each time.  

11. Produce and save a plot of the total square error vs. the polynomial order M.
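Steps 10 and 11 can be organised as a loop. This sketch assumes DesignMatrix.m from step 4 is on the path and regenerates the data so it is self-contained; the backslash form of the weights is equivalent to the formula in step 5:

```matlab
% Total square error for each polynomial order, then plot error vs M
x  = 0:0.5:4.5;  N = 10;
y  = 3-2*x+4*x.^2-5*x.^3+x.^4 + 4*randn(1,N);    % noisy training targets
xt = 0:0.15:4.5; Nt = 31;
yt = 3-2*xt+4*xt.^2-5*xt.^3+xt.^4;               % noise-free test targets
Ms = [1 3 4 9];  errs = zeros(size(Ms));
for k = 1:numel(Ms)
    M  = Ms(k);
    P  = DesignMatrix(x, M, N);
    w  = P\y';                                    % least-squares weights
    Pt = DesignMatrix(xt, M, Nt);
    fP = (Pt*w)';
    errs(k) = sum((fP - yt).^2);                  % total square error (step 9)
end
plot(Ms, errs, 'o-'); xlabel('M'); ylabel('total square error');
```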

12. Issue the command:

>> clear all  

to clear all of the variables from the MATLAB workspace.

PART B: CLASSIFICATION

1.  Clear all variables or restart MATLAB.

2.  Generate a synthetic data set for binary classification:

>> n0=35; %number of data points in class 1

>> n1=35; %number of data points in class 2

>> stdv=0.25; %standard deviation of noise in data

>> nt=n0+n1; %total number of data points

>> y=[zeros(n0,1);ones(n1,1)]; %vector of outputs  

>> x=[y+stdv*randn(n0+n1,1),y+stdv*randn(n0+n1,1)]; %inputs

Note that the green text (after the %) is just a comment explaining each step. It is ignored by MATLAB and you do not need to type it. The commands zeros and ones are used above; look at the MATLAB documentation for explanations. You can plot the inputs and indicate their class using blue circles (class 1) and red squares (class 2):

>> plot(x(1:n0,1),x(1:n0,2),'bo','MarkerSize',10);

>> axis equal

>> hold on;

>> plot(x(n0+1:nt,1),x(n0+1:nt,2),'rs','MarkerSize',10);
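If zeros and ones are new to you, this small self-contained example shows how the label vector y in step 2 is built (sizes chosen small for readability):

```matlab
% zeros(n,1) is an n-by-1 column of 0s; ones(n,1) is a column of 1s
a = zeros(3,1);   % labels for three class-1 points
b = ones(2,1);    % labels for two class-2 points
y = [a; b];       % stacked label vector: [0;0;0;1;1]
```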

3.  Write a MATLAB function in an m file that calculates the projection direction w and the orthogonal projection of all inputs:

function [w,xp,x1p,x0p]=fisher_train(x,y,n0,n1)

%Calculate the orthogonal projection direction w

x0=x(y==0,:); %inputs in class 1

x1=x(y==1,:); %inputs in class 2

mu0=mean(x0); %mean of inputs in class 1

mu1=mean(x1); %mean of inputs in class 2

S0=cov(x0)*(n0-1); %within class scatter matrix for class 1

S1=cov(x1)*(n1-1); %within class scatter matrix for class 2

w=inv(S0+S1)*(mu1-mu0)'; %direction for projection  

w=w/norm(w); %normalise the direction  

xp=x*w; %coefficients of the projected inputs onto w  

x00=xp(y==0,:); %projection coefficients corresponding to class 1

x11=xp(y==1,:); %projection coefficients corresponding to class 2

for i=1:n0 %orthogonal projection of class 1 points onto w

   x0p(i,:)=x00(i)*w;

end

for i=1:n1 %orthogonal projection of class 2 points onto w

   x1p(i,:)=x11(i)*w;

end

The expression x(y==0,:) uses the logical comparison == to select the rows of x for which y equals 0. The MATLAB commands mean and cov are also used; look at the documentation for their meanings.

Save the file as fisher_train.m.  
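The selection x(y==0,:) relies on logical indexing: y==0 produces a logical vector that picks out the matching rows. A small self-contained example:

```matlab
% Logical indexing: select rows of x by the class labels in y
y  = [0; 0; 1; 1; 1];
x  = [10 1; 20 2; 30 3; 40 4; 50 5];
x0 = x(y==0, :);   % rows where y is 0 -> [10 1; 20 2]
x1 = x(y==1, :);   % rows where y is 1
```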

4.  Call the function fisher_train to find the projected inputs x0p and x1p

>> [w,xp,x1p,x0p]=fisher_train(x,y,n0,n1)

Plot the original inputs x as blue circles for class 1 and red squares for class 2. On the same figure, plot the projected points x0p as filled blue circles and x1p as filled red squares. The latter plots can be obtained from:

plot(x0p(1:n0,1),x0p(1:n0,2),'bo','MarkerSize',10,'MarkerFaceColor','b');

grid on;

hold on

plot(x1p(1:n1,1),x1p(1:n1,2),'rs','MarkerSize',10,'MarkerFaceColor','r');

Save the figure in EPS format.

PART C: REGRESSION VIA ANN

1. Clear all variables or restart MATLAB.

2. Generate a synthetic data set for training and testing based on the underlying function

f(x) = 28 sin(4πx) exp(-3x)

by typing the following in the command window:

>> ntrain = 32;        %number of training data

>> ntest = 256;        %number of testing data

>> xTrain = rand(1, ntrain)';  %training inputs

>> xTest = linspace(0,1,ntest)';   %test inputs

>> yTest = 28*sin(xTest*4*pi).*exp(-xTest.*3); %test outputs

>> yTrain = 28*sin(xTrain*4*pi).*exp(-xTrain.*3); %training outputs

>> nu=0.5;

>> yTrain = yTrain+nu*randn(ntrain,1); %add white noise to training data


3. Visualise the training and test data in a figure:

>> figure(1)

>> plot(xTest, yTest, 'k-') %plot the test points as a black curve

>> hold on                  %prevent plot overwriting

>> plot(xTrain, yTrain, 'b+') %plot the training points as blue +

>> hold off %release the hold so later plots overwrite

>> legend('target function', 'observations')%labels for each curve


4. Create a MATLAB function (m file) that defines an ANN and trains it using the data set generated above. You will pass the training and test data as arguments to the function, which will define the ANN structure, perform training and plot the results.

function ANN(ntrain,ntest,xTrain,xTest,yTrain,yTest)

%---------------------------------------------------------%

nunit=4; %number of neurons in the hidden layer

net=feedforwardnet(nunit); %define a 1 hidden layer ANN with nunit neurons

net.divideParam.trainRatio = 80/100; %use 80% of yTrain for training

net.divideParam.valRatio = 20/100; %use 20% of yTrain for validation

net.divideParam.testRatio = 0/100; %use 0% of yTrain for testing

net = train(net, xTrain', yTrain'); %fit the ANN to training data

yPrediction = net(xTest'); %prediction given test inputs

%---------------------------------------------------------%

figure(2)

plot(xTest, yTest, 'k-')

hold on

plot(xTrain, yTrain, 'b+')

plot(xTest, yPrediction, 'r-.')

hold off

legend('target function', 'observations', 'learned function')


5. Run the function in the command window:

>> ANN(ntrain,ntest,xTrain,xTest,yTrain,yTest)

A window will appear to provide information as the ANN is trained; wait until training has finished.


Click on the ‘Performance’ button to see the model performance as a function of epoch. You will see at what point the training ends.


6. Run the function again (with the same training and test data) several times and observe the results in figure(2). Save the figure corresponding to the ANN with the best performance.


7. Now repeat steps 5. and 6. using nunit=8, nunit=16 and nunit=32.


8. Now repeat steps 2. to 6. for ntrain = 128 and ntrain = 256, using nunit = 8 and nunit = 16 (i.e., four different cases).
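One way to organise the four cases in step 8 (a sketch only; nunit is hard-coded inside ANN.m, so either edit it there between runs or add it as an extra input argument):

```matlab
% Loop over the four (ntrain, nunit) cases of step 8
ntest = 256;
xTest = linspace(0,1,ntest)';
yTest = 28*sin(xTest*4*pi).*exp(-xTest.*3);
for ntrain = [128 256]
    xTrain = rand(1,ntrain)';
    yTrain = 28*sin(xTrain*4*pi).*exp(-xTrain.*3) + 0.5*randn(ntrain,1);
    for nunit = [8 16]
        % set the hidden-layer size to nunit inside ANN.m before this call
        ANN(ntrain, ntest, xTrain, xTest, yTrain, yTest);
    end
end
```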


PART D: CLASSIFICATION VIA ANN

1. Clear all variables or restart MATLAB.


2. Load the wine data set shipped with MATLAB. It contains 178 samples of wine; each sample has 13 features (used as inputs) and a class label in one-hot (1-of-3) encoding.

>> [a,b] = wine_dataset; %a is the input data, b is the target data (class), 178 cases

>> ntotal=size(a,2); %total number of data points (178)

>> ntrain=16; %number of data points for training/validation

>> ntest=ntotal-ntrain; %number of data points for testing

>> indices=randperm(ntotal); %randomly mix the data

>> indices1=indices(1:ntrain); %indices for training/validation

>> indices2=indices(ntrain+1:ntotal); %indices for testing

>> x=a(:,indices1); %training/validation inputs

>> t=b(:,indices1); %training/validation targets (1-of-3 format)

>> xtest=a(:,indices2); %test inputs

>> ttest=b(:,indices2); %test targets (1-of-3 format)

>> net = patternnet(10); %ANN with one hidden layer and 10 hidden units

>> net.divideParam.trainRatio = 80/100; %use 80% of t for training

>> net.divideParam.valRatio = 20/100; %use 20% of t for validation

>> net.divideParam.testRatio = 0/100; %use 0% of t for testing

>> [net,tr] = train(net,x,t); %train the ANN


3. Predict classes for xtest using the trained network. This involves predicting the probability of each class as the output of the associated output-layer neuron. Then convert this to a 1-of-3 representation and compare it with the test targets ttest.

>> yPrediction = net(xtest); %probabilities of each class for xtest (3 output units)

>> [~,id]=max(yPrediction); %extract location of highest probability

>> yPr=full(sparse(id,1:ntest,1)); %1-of-3 format for predictions

>> yPr-ttest %subtract prediction from test targets to locate the misclassified points (errors)


Make a note of how many points are misclassified.
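Rather than scanning yPr-ttest by eye, you can count the misclassified points directly. A small self-contained example in the same 1-of-3 format (toy values, not the wine data):

```matlab
% Each misclassified point gives a column where yPr and ttest differ
ttest = [1 0 0 1; 0 1 0 0; 0 0 1 0];  % true classes:      1 2 3 1
yPr   = [1 0 0 0; 0 1 0 1; 0 0 1 0];  % predicted classes: 1 2 3 2
nmis  = sum(any(yPr ~= ttest, 1))     % number of misclassified points
```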

4. Repeat steps 2. and 3. for ntrain=32, ntrain=64 and ntrain=128, each time making a note of how many points are misclassified.

