OPTIMIZING CLOUD RESOURCE ALLOCATION FOR IOT SYSTEMS USING ML APPROACHES

  • Unique Paper ID: 167326
  • ISSN: 2349-6002
  • Volume: 4
  • Issue: 3
  • PageNo: 294-300
Abstract

This research investigates the use of several machine learning models to optimise cloud resource allocation for Internet of Things (IoT) applications. Using a collection of resource allocation measurements from multiple IoT deployments, a thorough analysis was carried out, evaluating each model for prediction accuracy, resource utilisation, and cost effectiveness. XGBoost achieved the best prediction accuracy, with a Mean Absolute Error (MAE) of 2.89 and a Root Mean Square Error (RMSE) of 4.98. The Neural Network came second, with an MAE of 3.01 and an RMSE of 5.12, and Random Forest also performed well, with an MAE of 3.12 and an RMSE of 5.34. In terms of resource utilisation, XGBoost and Neural Networks recorded the highest average CPU and memory utilisation, at 33.5% and 35.8% respectively. Decision Trees consumed fewer resources, with an average CPU utilisation of 28.7% and memory usage of 110 MB, but were significantly less accurate (MAE of 4.56, RMSE of 6.78). Cost analysis showed that Neural Networks incurred the highest total monthly cost at $2800, followed by XGBoost at $2700, while Decision Trees were the most cost-effective at $2400 per month. The study concludes that while XGBoost and Neural Networks offer superior accuracy, their higher operational costs may not be justified in all scenarios; Decision Trees, though less accurate, present a more cost-effective solution suitable for environments with strict budget constraints.
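
As a concrete illustration of the evaluation protocol the abstract describes, the sketch below trains the four model families and reports MAE and RMSE on a held-out split, where MAE = (1/n) Σ|y_i − ŷ_i| and RMSE = sqrt((1/n) Σ(y_i − ŷ_i)²). This is a minimal sketch under stated assumptions, not the paper's implementation: the dataset file (iot_metrics.csv), the target column (cpu_demand), and all hyperparameters are hypothetical placeholders.

    import numpy as np
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.neural_network import MLPRegressor
    from sklearn.metrics import mean_absolute_error, mean_squared_error
    from xgboost import XGBRegressor

    # Hypothetical input: tabular resource-allocation measurements from IoT
    # deployments, with a numeric demand column to predict. Neither the file
    # name nor the column names come from the paper.
    df = pd.read_csv("iot_metrics.csv")
    X = df.drop(columns=["cpu_demand"])
    y = df["cpu_demand"]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    # The four model families compared in the abstract; hyperparameters here
    # are illustrative defaults, not the paper's settings.
    models = {
        "XGBoost": XGBRegressor(n_estimators=300, random_state=42),
        "Neural Network": MLPRegressor(hidden_layer_sizes=(64, 32),
                                       max_iter=1000, random_state=42),
        "Random Forest": RandomForestRegressor(n_estimators=300, random_state=42),
        "Decision Tree": DecisionTreeRegressor(random_state=42),
    }

    for name, model in models.items():
        model.fit(X_train, y_train)
        pred = model.predict(X_test)
        mae = mean_absolute_error(y_test, pred)
        # RMSE taken as the square root of MSE, for compatibility across
        # scikit-learn versions.
        rmse = np.sqrt(mean_squared_error(y_test, pred))
        print(f"{name}: MAE={mae:.2f}, RMSE={rmse:.2f}")

The printed MAE/RMSE pairs correspond to the accuracy figures the abstract reports for each model; the CPU, memory, and cost comparisons would require additional instrumentation not shown here.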