(RP03) How to Perform Simple Linear Regression in R

Linear regression is like finding the perfect match on a dating app – it’s all about finding that sweet spot where two variables are meant to be together. Just like a good relationship, we want our data to have a positive correlation, no pattern in the residuals, and normally distributed errors. It’s all about finding that perfect fit! 📈🔍


In this video, we will delve into some of the basics of coding in the R language, particularly focusing on bivariate data and common analysis methods such as covariance, correlation, and regression lines between two numerical variables.

Basics of Bivariate Data

We will start by creating an artificial data set using the rnorm function to generate a normally distributed set of data. This will enable us to explore linear correlation and regression analysis, laying the foundation for more advanced methods in the future.

Creating the Data Set

We use the rnorm function to create a normal distribution for our data set, ensuring that it is relatively normally distributed with specified mean and standard deviation values.
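A minimal sketch of this step in R (the sample size, mean, and standard deviation here are illustrative assumptions, not necessarily the exact values used in the video):

```r
# Illustrative sketch: generate a predictor X from a normal distribution.
# The sample size, mean, and standard deviation are assumed values.
set.seed(42)                       # for reproducibility
n <- 100                           # number of observations
x <- rnorm(n, mean = 50, sd = 10)  # X ~ N(50, 10^2)

head(x)        # inspect the first few values
mean(x); sd(x) # should land close to 50 and 10
```

Setting a seed makes the "random" data reproducible, which is useful when following along with a tutorial.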

Generating the Error Vector

We also generate an error vector to introduce variance into our data set, so that Y is linearly related to X with realistic scatter rather than falling exactly on a line.
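One way to sketch this step (the intercept, slope, and error standard deviation below are assumptions for illustration, not the video's exact values):

```r
# Illustrative sketch: build Y as a linear function of X plus normal noise.
# The coefficients (intercept 5, slope 2) and error sd (15) are assumed.
set.seed(42)
n <- 100
x <- rnorm(n, mean = 50, sd = 10)
e <- rnorm(n, mean = 0, sd = 15)   # error vector adds scatter around the line
y <- 5 + 2 * x + e                 # underlying relationship: y = 5 + 2x + error
plot(x, y)                         # scatter plot shows the linear trend
```

A larger error standard deviation weakens the visible correlation; a smaller one tightens the points around the line.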

Analysis of the Data Set

Next, we will analyze the distributions, covariance, and correlation between X and Y, calculating key metrics such as the mean, standard deviation, and correlation coefficient.
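These summary statistics can be computed with R's built-in functions (continuing the simulated data sketched above; the exact numbers depend on the seed and parameters chosen, so they will not match the video's table):

```r
# Summary statistics for the simulated data (illustrative values).
set.seed(42)
x <- rnorm(100, mean = 50, sd = 10)
y <- 5 + 2 * x + rnorm(100, mean = 0, sd = 15)

mean(x); sd(x)   # center and spread of X
mean(y); sd(y)   # center and spread of Y
cov(x, y)        # sample covariance of X and Y
cor(x, y)        # correlation coefficient r, always between -1 and 1
```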

Metric                     Value
Mean of X                  45.6
Standard deviation of Y    7.2
Covariance (SX, SY)
Correlation (r)            0.087

Calculating the Regression Line

With the basic analysis completed, we move on to calculate the slope and intercept of the regression line. We then plot the linear regression line on the scatter plot of X and Y values to visually represent the linear correlation.

Residual Analysis

To ensure the appropriateness of the linear regression, we analyze the residuals to check for symmetry, randomness, and normal distribution. We also calculate the sum of squared residuals and root mean square error to assess the overall model fit.

Residual Metric            Value
Sum of Squared Residuals   44,000
Root Mean Square Error     14
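The residual checks described above can be sketched as follows (continuing the illustrative simulated data; the metric values will differ from the table):

```r
# Residual diagnostics for the fitted line (illustrative data).
set.seed(42)
x <- rnorm(100, mean = 50, sd = 10)
y <- 5 + 2 * x + rnorm(100, mean = 0, sd = 15)
fit <- lm(y ~ x)

res  <- resid(fit)               # residuals: observed minus fitted values
ssr  <- sum(res^2)               # sum of squared residuals
rmse <- sqrt(ssr / length(res))  # root mean square error

plot(fitted(fit), res); abline(h = 0)  # look for random scatter around zero
hist(res)                              # check for rough normality
qqnorm(res); qqline(res)               # normal Q-Q plot of the residuals
```

With an intercept in the model, least squares forces the residuals to sum to zero, so the plots are about their pattern and shape, not their mean.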


In conclusion, we have explored the basics of simple linear regression in R, from creating the data set to analyzing the linear correlation and regression line. We have also demonstrated the importance of residual analysis in evaluating the appropriateness of the regression model.

Key Takeaways:

  • Simple linear regression is a fundamental method for analyzing linear relationships between two variables.
  • Residual analysis is crucial in ensuring the suitability of the regression model.

We hope you found this video informative and look forward to sharing more advanced concepts in future videos. Stay tuned!

Take care and happy coding!

About the Author

Let’s Learn, Nemo!
23.3K subscribers

About the Channel:

Hello there! This channel aims to provide a thorough treatment of most mathematical subjects encountered in universities. The goal is to keep playlists up to date as new topics arise in real-world applications, to revise videos based on audience feedback, and to help you gain a better understanding of, and a stronger interest in, mathematics and statistics!

For those who are new to mathematics in general, or want a starting point on this channel, Pre-Calculus (Part A) is the best place to begin. I include topics appropriate to the difficulty of each playlist, even if they are not (currently) taught in colleges and universities; for example, I introduce tetration and the Lambert W function in Pre-Calculus, and the Gamma and Riemann zeta functions in Integral Calculus, to name a few. If you enjoy the channel, please subscribe, and feel free to leave questions in the comments on videos; I will respond! Enjoy!