Linear Regression for the Years of Experience and Salary Dataset
Gradient Descent in Linear Regression
In linear regression, the model aims to find the best-fit regression line to predict the value of y for a given input value x. While training, the model computes the cost function, which measures the root-mean-squared error between the predicted value (pred) and the true value (y), and tries to minimize it. To minimize the cost function, the model needs the best values of θ1 and θ2. Initially the model selects θ1 and θ2 at random and then iteratively updates these values to reduce the cost function, until it reaches the minimum. By the time the model achieves the minimum cost, it has the best θ1 and θ2 values. Plugging these final values of θ1 and θ2 into the hypothesis equation of the linear model, the model predicts the output y for a given x as well as it can. Therefore, the question arises: how do the θ1 and θ2 values get updated?
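The figures with the exact formulas are not reproduced here. A standard formulation consistent with the description above, assuming θ1 is the intercept, θ2 is the slope, and there are m training examples, is:

h_\theta(x) = \theta_1 + \theta_2 x

J(\theta_1, \theta_2) = \frac{1}{2m} \sum_{i=1}^{m} \big( h_\theta(x_i) - y_i \big)^2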
-> θj : weight j of the hypothesis.
-> hθ(xi) : predicted y value for the i-th input.
-> j : feature index number (can be 0, 1, 2, ..., n).
-> α : learning rate of gradient descent.

We graph the cost function as a function of the parameter estimates, i.e., over the range of parameters of our hypothesis function and the cost resulting from selecting a particular set of parameters. We move downward towards the pits in the graph to find the minimum value; the way to do this is by taking the derivative of the cost function, which gives the update rule shown below. Gradient descent steps down the cost function in the direction of steepest descent, and the size of each step is determined by the parameter α, known as the learning rate.
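The per-iteration update rule (a standard form, written to match the cost function above rather than copied from the notebook's figure) is:

\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta_1, \theta_2)

which for this model works out to

\frac{\partial J}{\partial \theta_1} = \frac{1}{m} \sum_{i=1}^{m} \big( h_\theta(x_i) - y_i \big), \qquad
\frac{\partial J}{\partial \theta_2} = \frac{1}{m} \sum_{i=1}^{m} \big( h_\theta(x_i) - y_i \big)\, x_i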
From the gradient descent update rule, one can infer two points:

1. If the slope is positive: θj = θj − (positive value). Hence the value of θj decreases.
2. If the slope is negative: θj = θj − (negative value). Hence the value of θj increases.
Choosing a correct learning rate is very important, as it determines whether gradient descent converges in a reasonable time (see the short sketch below):

1. If we choose α to be very large, gradient descent can overshoot the minimum. It may fail to converge, or even diverge.

2. If we choose α to be very small, gradient descent takes tiny steps towards the minimum and therefore needs much longer to reach it.
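A minimal sketch of this behaviour, not taken from the notebook: gradient descent on the one-dimensional convex function J(θ) = (θ − 3)², whose gradient is 2(θ − 3) and whose minimum is at θ = 3.

def run_gd(alpha, steps=25, theta=0.0):
    # theta := theta - alpha * dJ/dtheta, with dJ/dtheta = 2 * (theta - 3)
    for _ in range(steps):
        theta = theta - alpha * 2 * (theta - 3)
    return theta

print(run_gd(alpha=0.01))  # very small alpha: still far from 3 after 25 steps
print(run_gd(alpha=0.5))   # moderate alpha: lands on the minimum at 3
print(run_gd(alpha=1.05))  # too large alpha: overshoots and moves away from 3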
For linear regression, the cost function graph is always convex (bowl-shaped), so gradient descent cannot get stuck in a non-global minimum.
Reference: geeksforgeeks.org/gradient-descent-in-linear-regression/ (referred to for the theory above).
First, we import a few libraries:

In [10]:

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

Data Preprocessing & Importing Dataset

The next step is to import our dataset 'Salary_Data.csv' and split it into the input (independent) variable and the output (dependent) variable. Real datasets usually have thousands of rows, but the sample used here has just 30. When we split the data into a training set and a testing set, we use a test size of 1/3, i.e., 20 rows go into the training set and the remaining 10 make up the testing set.

In [11]:

dataset = pd.read_csv('Salary_Data.csv')
x = dataset.iloc[:, :-1].values  # YearsExperience column (2-D array of inputs)
y = dataset.iloc[:, 1].values    # Salary column (1-D array of outputs)
data_top = dataset.head()        # Display the first rows of the dataset
data_top

Out[11]:

  YearsExperience   Salary
0             1.1  39343.0
1             1.3  46205.0
2             1.5  37731.0
3             2.0  43525.0
4             2.2  39891.0

Plotting Default Dataset
In [12]:

plt.scatter(x, y, color = "red")
plt.plot(x, y, color = "green")
plt.title("Salary vs Experience (Dataset)")
plt.xlabel("Years of Experience")
plt.ylabel("Salary")
plt.show()

Training & Split

Split the arrays into random train and test subsets.

In [13]:

from sklearn.model_selection import train_test_split
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size = 1/3)

Linear Regression

Now we import the linear regression class and create an object of that class, which is the linear regression model.

In [14]:

from sklearn.linear_model import LinearRegression
lr = LinearRegression()

Fitting Data

Then we use the fit method to "fit" the model to our dataset. What this does is nothing but make the regressor "study" our data and "learn" from it.
In [15]:

lr.fit(x_train, y_train)

Out[15]:

LinearRegression(copy_X=True, fit_intercept=True, n_jobs=None, normalize=False)

Testing

Now that we have created our model and trained it, it is time to test the model with our testing dataset.

In [16]:

y_pred = lr.predict(x_test)
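The notebook does not report a numeric test score; as a quick check that is not part of the original activity, the predictions could be compared against the held-out salaries, for example:

from sklearn.metrics import mean_squared_error, r2_score
import numpy as np

rmse = np.sqrt(mean_squared_error(y_test, y_pred))  # typical prediction error, in salary units
r2 = r2_score(y_test, y_pred)                       # fraction of salary variance explained
print("RMSE:", rmse)
print("R^2 :", r2)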
Data Visualization for Training Dataset

First, we use a scatter plot for the actual observations, with x_train on the x-axis and y_train on the y-axis. For the regression line, we use x_train on the x-axis and the predictions for the x_train observations on the y-axis. We add a touch of aesthetics by coloring the original observations in red and the regression line in green.

In [17]:

plt.scatter(x_train, y_train, color = "red")
plt.plot(x_train, lr.predict(x_train), color = "green")
plt.title("Salary vs Experience (Training set)")
plt.xlabel("Years of Experience")
plt.ylabel("Salary")
plt.show()
Data Visualization for Testing Dataset

We repeat the same task for our testing dataset, which gives the following code:

In [24]:

plt.scatter(x_test, y_test, color = "red")
plt.plot(x_train, lr.predict(x_train), color = "green")  # same regression line learned from the training data
plt.title("Salary vs Experience (Testing set)")
plt.xlabel("Years of Experience")
plt.ylabel("Salary")
plt.show()

Linear Regression with Gradient Descent Algorithm Approach for the Years of Experience and Salary Dataset

Importing Libraries, Dataset & Data Preprocessing
In [27]:
# Making the imports
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
plt.rcParams['figure.figsize'] = (12.0, 9.0)
# Preprocessing Input data
data = pd.read_csv('Salary_Data.csv')
X = data.iloc[:, 0]
Y = data.iloc[:, 1]
#Plotting Data for visualization
plt.scatter(X, Y)
plt.title("Salary vs Experience (Dataset set)")
plt.xlabel("Years of Experience")
plt.ylabel("Salary")
plt.show()
Optimizing the parameter values (intercept "c" and slope "m") using the learning rate alpha in the gradient descent formula

In [26]:

# Building the model
m = 0
c = 0
L = 0.0001  # The learning rate
#L = 0.0002 # Alternative learning rates that were tried
#L = 0.0003
epochs = 1000      # The number of gradient descent iterations
n = float(len(X))  # Number of elements in X

# Performing gradient descent
for i in range(epochs):
    Y_pred = m*X + c                      # The current predicted value of Y
    D_m = (-2/n) * sum(X * (Y - Y_pred))  # Derivative of the cost w.r.t. m
    D_c = (-2/n) * sum(Y - Y_pred)        # Derivative of the cost w.r.t. c
    m = m - L * D_m                       # Update m
    c = c - L * D_c                       # Update c

print("Slope & Intercept:")
print(m, c)

Slope & Intercept:
12836.600965885045 2915.2044856014018
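As a sanity check that is not part of the original notebook, these gradient descent estimates can be compared against an exact least-squares fit on the same X and Y; a large gap would suggest that more epochs or a different learning rate are needed for convergence:

from sklearn.linear_model import LinearRegression

ols = LinearRegression().fit(X.values.reshape(-1, 1), Y)  # closed-form least-squares fit on the full data
print("sklearn slope    :", ols.coef_[0])
print("sklearn intercept:", ols.intercept_)
print("gradient descent :", m, c)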
Prediction

In [28]:

# Making predictions with the learned parameters
Y_pred = m*X + c

plt.scatter(X, Y)
plt.plot([min(X), max(X)], [min(Y_pred), max(Y_pred)], color='red')  # regression line
plt.title("Salary vs Experience (prediction)")
plt.xlabel("Years of Experience")
plt.ylabel("Salary")
plt.show()
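With the fitted slope and intercept in hand, a single hypothetical query can be answered directly (this example is not in the original notebook, and the exact number depends on how well m and c have converged):

years = 5.0                       # hypothetical input: 5 years of experience
predicted_salary = m * years + c  # same hypothesis used throughout
print("Predicted salary for", years, "years of experience:", predicted_salary)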
Batch 1 Block 1 B.TECH ENTC
Omkar Rane BETB118
Kaustubh Wankhade BETB129