A calibration curve is a mathematical relationship between an instrument's signal and a set of known analyte concentrations. The equation of this curve is then used to predict the unknown concentration of a sample. Because of random errors, the experimental data rarely lie perfectly on a straight line, so the linear least squares (LLS) method, a form of regression analysis, is used to obtain the straight line that best fits the data points. LLS rests on two assumptions: first, that a linear relationship exists between the instrument's signal and the analyte concentration; second, that the scatter in the data is due to random error in the measured signal, not to systematic or human error. The best-fitting line is found by minimizing the sum of the squared differences between the values predicted by the line and the actual measured values. Fitting the data yields the equation of the line, y = mx + c, where y is the instrument's signal, x is the analyte concentration, m is the slope of the line, and c is the y-intercept. The unknown concentration of a sample is determined by measuring its instrumental signal and solving the equation for x.
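The procedure above can be sketched in a few lines of code. The calibration standards below are hypothetical values chosen only for illustration; the slope and intercept are computed from the standard least-squares formulas, and the unknown concentration is recovered by solving y = mx + c for x.

```python
# Sketch of a linear least-squares calibration fit.
# The standards below are hypothetical, for illustration only.
concentrations = [0.0, 2.0, 4.0, 6.0, 8.0]   # x: known analyte concentrations
signals = [0.05, 0.41, 0.80, 1.22, 1.58]     # y: measured instrument signals

n = len(concentrations)
x_mean = sum(concentrations) / n
y_mean = sum(signals) / n

# Slope m minimizes the sum of squared residuals:
#   m = sum((x - x_mean)(y - y_mean)) / sum((x - x_mean)^2)
m = sum((x - x_mean) * (y - y_mean)
        for x, y in zip(concentrations, signals)) \
    / sum((x - x_mean) ** 2 for x in concentrations)
c = y_mean - m * x_mean  # intercept: best-fit line passes through (x_mean, y_mean)

# Predict an unknown concentration from its measured signal (hypothetical value)
# by solving y = mx + c for x:
unknown_signal = 1.00
unknown_conc = (unknown_signal - c) / m

print(f"m = {m:.4f}, c = {c:.4f}, unknown concentration = {unknown_conc:.2f}")
```

In practice the same fit is usually obtained with a library routine such as `numpy.polyfit(x, y, 1)`; the explicit formulas are shown here to mirror the description in the text.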