Notice Board:

Volume XVII Issue VII

Author Name
Shivani Yadav, Hitesh Soni, Kamlesh Patidar
Year Of Publication
2025
Volume and Issue
Volume 17 Issue 7
Abstract
Regression analysis is a cornerstone of predictive analytics, used extensively in science, engineering, and business to model relationships between variables. A fundamental challenge is finding parameter estimates that minimize the cost function, ensuring the best possible model fit. Gradient Descent (GD), a first-order iterative optimization algorithm, has emerged as a preferred approach for this purpose due to its simplicity, scalability, and effectiveness. This paper explores the principles of Gradient Descent, its advantages over traditional analytical solutions, its practical implementation in linear regression, and its performance in minimizing the Mean Squared Error (MSE). Experiments demonstrate how GD provides a robust, flexible approach to finding optimal regression parameters, especially for large and complex datasets where closed-form solutions may be impractical.
PaperID
2025/EUSRM/7/2025/61696
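The gradient-descent approach described in the abstract above can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the function name, learning rate, iteration count, and synthetic data are all assumptions made for demonstration.

```python
import numpy as np

def gradient_descent(x, y, learning_rate=0.05, n_iters=2000):
    """Batch gradient descent for simple linear regression, minimizing MSE.

    Parameter names (theta0 = intercept, theta1 = slope) are illustrative.
    """
    theta0, theta1 = 0.0, 0.0
    m = len(x)
    for _ in range(n_iters):
        y_pred = theta0 + theta1 * x
        error = y_pred - y
        # Gradients of MSE = (1/m) * sum(error^2) w.r.t. each parameter
        grad0 = (2.0 / m) * error.sum()
        grad1 = (2.0 / m) * (error * x).sum()
        theta0 -= learning_rate * grad0
        theta1 -= learning_rate * grad1
    return theta0, theta1

# Synthetic data generated from y = 1 + 2x; GD should recover these parameters
x = np.linspace(0, 1, 50)
y = 1.0 + 2.0 * x
b, w = gradient_descent(x, y)
print(round(b, 2), round(w, 2))
```

Unlike the closed-form normal equations, this iterative update only ever touches the data through gradient evaluations, which is what makes it attractive for the large datasets the abstract mentions.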

Author Name
Ashu Batham, Pawan Singh Rajput
Year Of Publication
2025
Volume and Issue
Volume 17 Issue 7
Abstract
In the modern data-driven business environment, market sales analytics has emerged as a vital component in strategic decision-making. Machine learning (ML) plays a pivotal role in uncovering hidden patterns, predicting sales trends, and understanding customer behavior. This review paper explores the application of both supervised and unsupervised machine learning techniques in market sales analytics. We examine popular algorithms, their comparative performance, and their practical applications in real-world scenarios. Furthermore, we discuss the challenges, opportunities, and future directions in the field to support businesses in building data-informed strategies.
PaperID
2025/EUSRM/7/2025/61695

Author Name
Swati Yadav, Nilesh Parmar, Kamlesh Patidar
Year Of Publication
2025
Volume and Issue
Volume 17 Issue 7
Abstract
Multicollinearity can be a problem in a regression model because it prevents us from distinguishing the individual effects of the independent variables on the dependent variable. Multicollinearity may not greatly affect the predictive accuracy of the model, but it reduces the reliability of the estimated effects of individual features, which is a problem for interpretability. Multicollinearity is the presence of high correlations between two or more independent variables (predictors). Correlation is the association between variables; it measures the extent to which two variables are related to each other. Two variables can have positive correlation (a change in one variable is accompanied by a change in the other in the same direction), negative correlation (a change in one variable is accompanied by a change in the other in the opposite direction), or no correlation. A simple example of positive correlation is weight and height. A simple example of a negative correlation
PaperID
2025/EUSRM/7/2025/61700
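The correlation concepts in the abstract above can be sketched numerically. This is an illustrative example only; the synthetic data, variable names, and the |r| > 0.8 flagging threshold are assumptions made for demonstration, not taken from the paper.

```python
import numpy as np

# Synthetic predictors: weight is strongly related to height (the abstract's
# positive-correlation example), while `noise` is unrelated to both.
rng = np.random.default_rng(0)
height = rng.normal(170, 10, 200)
weight = 0.9 * height + rng.normal(0, 5, 200)
noise = rng.normal(0, 1, 200)

# Pearson correlation coefficients: near +1 for related pairs, near 0 otherwise
r_pos = np.corrcoef(height, weight)[0, 1]
r_none = np.corrcoef(height, noise)[0, 1]
print(f"height~weight r = {r_pos:.2f}")   # strongly positive
print(f"height~noise  r = {r_none:.2f}")  # near zero

# Flag predictor pairs whose |r| exceeds a chosen threshold -- a simple
# screen for multicollinearity before fitting a regression model
X = np.column_stack([height, weight, noise])
corr = np.corrcoef(X, rowvar=False)
pairs = [(i, j) for i in range(3) for j in range(i + 1, 3)
         if abs(corr[i, j]) > 0.8]
print("highly correlated predictor pairs:", pairs)
```

A pairwise correlation screen like this catches two-variable collinearity; detecting collinearity among three or more predictors jointly typically requires a measure such as the variance inflation factor.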