Implement Gradient Descent:

"

Gradient descent is a first-order iterative optimization algorithm for finding the minimum of a function. To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the gradient (or approximate gradient) of the function at the current point.

Gradient descent - Wikipedia


https://en.wikipedia.org/wiki/Gradient_descent

"

Gradient Descent

# From calculation, it is expected that the local minimum occurs at x=9/4

"""

cur_x = 6 # The algorithm starts at x=6

gamma = 0.01 # step size multiplier

precision = 0.00001

previous_step_size = 1

max_iters = 10000 # maximum number of iterations

iters = 0 #iteration counter

df = lambda x: 4 * x**3 - 9 * x**2

while previous_step_size > precision and iters < max_iters:
    prev_x = cur_x
    cur_x -= gamma * df(prev_x)
    previous_step_size = abs(cur_x - prev_x)
    iters += 1

print("The local minimum occurs at", cur_x)

# The output for the above (with Python 3's print) will be: The local minimum occurs at 2.2499646074278457

"""

#----

print('my part')

co_ef = 6             # the algorithm starts at x = 6

iters = 0             # iteration counter (renamed from 'iter', which shadows a Python built-in)

max_iter = 1000       # maximum number of iterations

gamma = 0.001         # step size multiplier

step = 1

precision = 0.0000001

df = lambda x: 4 * x * x * x - 9 * x * x

# 'and' (not 'or'), and the counter must be incremented, or the loop never terminates
while (iters < max_iter) and (step > precision):
    prev_co_ef = co_ef
    co_ef -= gamma * df(prev_co_ef)
    step = abs(prev_co_ef - co_ef)
    iters += 1

print(co_ef)          # final estimate, close to 9/4 = 2.25
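For an independent check (a sketch, assuming SciPy is available; the original post does not use it), a library optimizer lands on the same point:

from scipy.optimize import minimize_scalar

result = minimize_scalar(lambda x: x**4 - 3 * x**3)  # minimize f directly
print(result.x)  # approximately 2.25, matching both gradient descent loops above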

Sayed Ahmed
sayedum

LinkedIn: https://ca.linkedin.com/in/sayedjustetc

Blog: http://sitestree.com, http://bangla.salearningschool.com