This calculator performs a comprehensive analysis of the linear relationship between two continuous variables. It covers everything you need for professional statistical analysis, from basic model fitting to diagnostic testing, helping you verify that your regression model satisfies the standard statistical assumptions and delivers reliable insights.
💡 Pro Tip: Always examine the diagnostic plots before interpreting your results! The residuals vs fitted plot reveals non-linear patterns, while the Q-Q plot checks normality assumptions. For multiple regression analysis, check out our Multiple Linear Regression Calculator to analyze relationships with several predictor variables.
Ready to explore the linear relationship in your data? Load the example dataset to see the required data format and regression analysis in action, or upload your own data to discover the strength and direction of the relationship between your variables.
Simple Linear Regression models the relationship between a predictor variable (X) and a response variable (Y) using a linear equation. It finds the line that minimizes the sum of squared residuals.
Regression Line: ŷ = b₀ + b₁x

Slope: b₁ = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)²

Intercept: b₀ = ȳ − b₁x̄

R-squared: R² = 1 − SSE/SST
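As a sketch of how these formulas play out in practice, the slope, intercept, and R² can be computed directly with plain NumPy (using the example data from the worked table below):

```python
import numpy as np

x = np.array([1, 2, 3, 4, 5])
y = np.array([2.1, 3.8, 6.2, 7.8, 9.3])

x_bar, y_bar = x.mean(), y.mean()

# Slope: sum of cross-deviations over sum of squared x-deviations
b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
# Intercept: forces the fitted line through the point (x_bar, y_bar)
b0 = y_bar - b1 * x_bar

# R-squared: 1 - SSE/SST
residuals = y - (b0 + b1 * x)
r_squared = 1 - np.sum(residuals ** 2) / np.sum((y - y_bar) ** 2)

print(b1, b0, r_squared)  # slope 1.84, intercept 0.32, R² ≈ 0.993
```

These match the coefficients reported by `lm()` in R and `sm.OLS` in Python for the same data.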
| x | y | x − x̄ | y − ȳ | (x − x̄)² | (x − x̄)(y − ȳ) |
| --- | --- | --- | --- | --- | --- |
| 1 | 2.1 | -2 | -3.74 | 4 | 7.48 |
| 2 | 3.8 | -1 | -2.04 | 1 | 2.04 |
| 3 | 6.2 | 0 | 0.36 | 0 | 0 |
| 4 | 7.8 | 1 | 1.96 | 1 | 1.96 |
| 5 | 9.3 | 2 | 3.46 | 4 | 6.92 |
| Sum | | | | 10 | 18.40 |

Means: x̄ = 3, ȳ = 5.84

Slope: b₁ = 18.40 / 10 = 1.84

Intercept: b₀ = 5.84 − 1.84 × 3 = 0.32

R² = 0.993 (99.3% of the variation in Y is explained by X)
```r
library(tidyverse)

# Fit the model and inspect coefficients, R-squared, and p-values
data <- tibble(x = c(1, 2, 3, 4, 5),
               y = c(2.1, 3.8, 6.2, 7.8, 9.3))
model <- lm(y ~ x, data = data)
summary(model)

# Scatter plot with the fitted regression line
ggplot(data, aes(x = x, y = y)) +
  geom_point() +
  geom_smooth(method = "lm", se = FALSE) +
  theme_minimal()

# Standard diagnostic plots (residuals vs fitted, Q-Q, scale-location, leverage)
par(mfrow = c(2, 2))
plot(model)
```

```python
import numpy as np
import statsmodels.api as sm

x = np.array([1, 2, 3, 4, 5])
y = np.array([2.1, 3.8, 6.2, 7.8, 9.3])

# Add an intercept column and fit by ordinary least squares
X = sm.add_constant(x)
model = sm.OLS(y, X).fit()
print(model.summary())
```

Consider these alternatives when assumptions are violated:

- Transform the variables (e.g., a log transformation) when residuals show curvature or increasing spread
- Weighted least squares when the error variance is not constant
- Robust methods (e.g., robust regression or the Theil-Sen estimator) when outliers distort the fit
- Polynomial or non-linear models when the relationship is clearly not a straight line
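One widely used robust alternative is the Theil-Sen estimator, which takes the median of all pairwise slopes and is far less sensitive to outliers than ordinary least squares. A sketch using SciPy (assumed available), with an artificial outlier added to the example data:

```python
import numpy as np
from scipy.stats import theilslopes

x = np.array([1, 2, 3, 4, 5])
y = np.array([2.1, 3.8, 6.2, 7.8, 20.0])  # last point replaced by an artificial outlier

# Median of the slopes between all pairs of points; the outlier barely moves it
slope, intercept, low, high = theilslopes(y, x)
print(slope, intercept)
```

On this contaminated data the OLS slope is pulled up to about 3.98, while the Theil-Sen slope stays much closer to the outlier-free estimate of 1.84.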