Coursera Machine Learning: Linear Regression Octave programming assignment

1. Submitting the assignment with Octave
1.1 Download Octave
1.1.1 Download from Baidu Netdisk

Link: https://pan.baidu.com/s/1of7sWiqovaKBBRFGEihUmQ
Extraction code: 8d9u

1.1.2 Download from the official site
http://ftp.gnu.org/gnu/octave/windows/

1.1.3 Download MATLAB? Never mind, it is far too large; I'd rather not...

1.2 Open and run Octave
If your download does not include a runnable .exe, you can launch octave.vbs instead.

1.2.1 Change the working path
The simplest way is to copy the machine-learning-ex1 folder downloaded from the course site into the folder where Octave is installed (see above).
Another way is to add the path explicitly:
p = 'C:\Users\limengyuan\Desktop\machine-learning-ex1'
addpath(p)
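If you would rather leave the folder where it is, you can also simply change into it before running anything. A minimal sketch (using the same example path as above; the exact subfolder that contains ex1.m depends on how you unzipped the package):

cd('C:\Users\limengyuan\Desktop\machine-learning-ex1\ex1')
pwd    % confirm this folder contains ex1.m and submit.m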

1.2.2 Other configuration
Some of the functions you need are already included in the assignment package (machine-learning-ex1) from the course site, so make sure you download it.

1.2.3 Run the code
Reference code can be found at
https://blog.csdn.net/goddywu/article/details/100220646
or
https://www.jianshu.com/p/9066919072d4

Bug ⚠️: "findstr is obsolete; use strfind instead"
The underlying problem is that the pause() function fails to respond to key presses (see https://www.mobibrw.com/2019/18501 for details). For now, the only way to run ex1.m is to comment out each pause; statement one by one.
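The edit itself is nothing more than commenting those lines out; a minimal sketch of one such change in ex1.m:

% before: the script stops here and waits for a key press
pause;

% after: commented out so the script runs straight through
% pause;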

Copy the reference code below into the corresponding files, then type submit() and you are done.

Sample code from those references:
warmUpExercise.m:

function A = warmUpExercise()
%WARMUPEXERCISE Example function in octave
%   A = WARMUPEXERCISE() is an example function that returns the 5x5 identity matrix

A = [];
% ============= YOUR CODE HERE ==============
% Instructions: Return the 5x5 identity matrix 
%               In octave, we return values by defining which variables
%               represent the return values (at the top of the file)
%               and then set them accordingly. 
A = eye(5);
% ===========================================
end
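A quick sanity check before submitting, run from the Octave prompt in the exercise folder:

isequal(warmUpExercise(), eye(5))    % prints ans = 1 when the function returns the 5x5 identity matrix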

computeCost.m

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly 
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

h_theta = X*theta;
J = 1/2/m * sum((h_theta-y).^2);

% =========================================================================

end
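A quick check against the exercise, assuming ex1data1.txt from the assignment package is in the current folder (the exercise materials give roughly 32.07 as the expected cost for theta = [0; 0]):

data = load('ex1data1.txt');      % column 1: city population, column 2: profit
m = size(data, 1);                % number of training examples
X = [ones(m, 1), data(:, 1)];     % add the intercept term
y = data(:, 2);
J = computeCost(X, y, [0; 0])     % should print approximately 32.07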

gradientDescent.m

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by 
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta. 
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    theta = theta - alpha*(1/m)*X'*(X*theta-y);

    % ============================================================

    % Save the cost J in every iteration    
    J_history(iter) = computeCost(X, y, theta);

end
end
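A similar check for gradient descent, using the same data and the learning rate and iteration count that ex1.m uses (alpha = 0.01, 1500 iterations); the exercise materials give roughly -3.6303 and 1.1664 as the expected theta values:

data = load('ex1data1.txt');
m = size(data, 1);
X = [ones(m, 1), data(:, 1)];     % add the intercept term
y = data(:, 2);
[theta, J_history] = gradientDescent(X, y, zeros(2, 1), 0.01, 1500);
theta                             % should be roughly [-3.6303; 1.1664]
plot(J_history)                   % the cost should fall steadily and then flatten out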

1.2.4 Submit the assignment

submit()
warning: addpath: ./lib: No such file or directory
warning: called from
submit at line 2 column 3
warning: addpath: ./lib/jsonlab: No such file or directory
warning: called from
submitWithConfiguration at line 2 column 3
submit at line 45 column 3
== Submitting solutions | Linear Regression with Multiple Variables…
Use token from last successful submission (thousand@thu.edu.cn)? (Y/n): Y
==
== Part Name | Score | Feedback
== --------- | ----- | --------
== Warm-up Exercise | 10 / 10 | Nice work!
== Computing Cost (for One Variable) | 40 / 40 | Nice work!
== Gradient Descent (for One Variable) | 50 / 50 | Nice work!
== Feature Normalization | 0 / 0 |
== Computing Cost (for Multiple Variables) | 0 / 0 |
== Gradient Descent (for Multiple Variables) | 0 / 0 |
== Normal Equations | 0 / 0 |
== --------------------------------
== | 100 / 100 |
==
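The addpath ./lib warnings at the top of that log are harmless here (the submission still scores 100/100); they just mean submit.m could not find a lib subfolder relative to the current directory. If you want the warnings gone, a quick check (sketch):

pwd                     % should be the exercise folder that contains submit.m and the lib subfolder
exist('lib', 'dir')     % returns 7 when a lib folder is visible from the current directory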

Copying assignment code is, admittedly, a shameful thing, and working things out on your own teaches you far more. But sometimes, when you run into something as niche as Octave and still have to hand in the assignment, it really is frustrating.
You could call this open source, or you could call it cutting corners, and it does go somewhat against the spirit of Andrew Ng's Coursera course... Well, I'm not that noble; I just hope this saves everyone a few unnecessary detours~

Happy studying!
