Challenge: Fitting a Line with Gradient Descent | Mathematical Analysis
Mathematics for Data Science

Challenge: Fitting a Line with Gradient Descent

A student is exploring how to use gradient descent to fit a straight line to a small dataset. The dataset shows years of experience versus salary (in thousands), and the goal is to find the best-fitting line using an iterative update rule.

Your task is to adjust the slope ($m$) and intercept ($b$) so that the line closely follows the data points.

The expression you are trying to minimize is:

$$J(m, b) = \frac{1}{n}\sum_{i=1}^{n}\big(y_i - (mx_i + b)\big)^2$$

The gradient descent update rules for minimizing this function are:

$$m \leftarrow m - \alpha \frac{\partial J}{\partial m}, \qquad b \leftarrow b - \alpha \frac{\partial J}{\partial b}$$

Where:

  • $\alpha$ is the learning rate (step size);
  • $\frac{\partial J}{\partial m}$ is the partial derivative of the loss function with respect to $m$;
  • $\frac{\partial J}{\partial b}$ is the partial derivative of the loss function with respect to $b$.
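
For reference, differentiating $J(m, b)$ with respect to each parameter yields the gradients used in the update rules above (a standard derivation, not spelled out in the challenge itself):

$$\frac{\partial J}{\partial m} = -\frac{2}{n}\sum_{i=1}^{n} x_i\big(y_i - (mx_i + b)\big), \qquad \frac{\partial J}{\partial b} = -\frac{2}{n}\sum_{i=1}^{n} \big(y_i - (mx_i + b)\big)$$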

This loss measures how far off your predicted points are from the actual data: smaller values mean the line fits the data better.
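
As a sketch of how the loss might be computed in plain Python (the function name `mse_loss` is ours, not part of the challenge's starter code):

```python
def mse_loss(m, b, xs, ys):
    """Mean squared error of the line y = m*x + b on the data (xs, ys)."""
    n = len(xs)
    return sum((y - (m * x + b)) ** 2 for x, y in zip(xs, ys)) / n
```

For example, `mse_loss(0, 0, xs, ys)` gives the loss for a flat line at zero, and the value shrinks as the line approaches the data.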

To find suitable values of $m$ and $b$, use gradient descent.

Task


  1. Complete the Python code below to implement the gradient descent steps.
  2. Fill in the missing expressions using basic Python operations.
  3. Track how the values of m and b change as the algorithm runs.
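
The steps above can be sketched end to end as follows. The dataset here is hypothetical (the challenge's actual data is not reproduced on this page), and the learning rate and iteration count are illustrative choices:

```python
# Hypothetical data: years of experience vs. salary (in thousands).
xs = [1, 2, 3, 4, 5]
ys = [40, 50, 58, 65, 75]

m, b = 0.0, 0.0   # initial guesses for slope and intercept
alpha = 0.01      # learning rate (step size)
n = len(xs)

for step in range(1000):
    # Partial derivatives of the mean squared error J(m, b)
    dm = (-2 / n) * sum(x * (y - (m * x + b)) for x, y in zip(xs, ys))
    db = (-2 / n) * sum(y - (m * x + b) for x, y in zip(xs, ys))
    # Gradient descent updates
    m -= alpha * dm
    b -= alpha * db

print(m, b)
```

Printing `m` and `b` inside the loop (say, every 100 steps) is one way to track how the parameters change as the algorithm runs.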

Solution


Section 3. Chapter 11
