A review on dynamic resource allocation in Industrial Internet of Things using Machine Learning

Keywords:
Resource Allocation, Internet of Things, Industrial IoT, Machine Learning, Computer Science

Abstract
Internet of Things (IoT) application solutions are becoming more common and their usage areas are expanding. As a result, there are continuous advances in the development of new IoT technologies. This new technology development has huge advantages; however, it also has limitations. Resource management and allocation is one of the biggest problems faced by IoT systems. Industrial IoT operations use fog, edge, or cloud computing techniques for computing and storing data. These computing nodes receive data from devices with limited resources, so resource management decisions for computing and storage must be made at fog, cloud, or edge nodes. For the system to function properly, resource management and allocation must be accurate and complete. Many approaches have been suggested for this, including ones based on machine learning. This paper examines resource allocation and management in the Internet of Things based on machine learning. In addition, various literature on resource allocation using reinforcement learning, optimization, and machine learning is compared in terms of several performance characteristics. Future research can concentrate on the scalability of the surveyed approaches and extend them to larger and more sophisticated models.
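As a toy illustration of the reinforcement-learning style of resource allocation the review compares, the sketch below uses a one-step Q-learning update to decide whether an IIoT task should run on an edge, fog, or cloud node. All node names, cost figures, and reward weights here are invented for illustration only; they are not taken from the paper or from any specific surveyed method.

```python
import random

# Hypothetical cost model: (base_latency, per_unit_cost) for each node type.
# Edge is fast but costly per unit of work; cloud has high base latency but
# cheap per-unit processing. These numbers are assumptions for the sketch.
NODES = ["edge", "fog", "cloud"]
COST = {"edge": (1.0, 3.0), "fog": (3.0, 2.0), "cloud": (5.0, 1.0)}

def reward(node, size):
    base_latency, per_unit = COST[node]
    # Negative total cost: the agent learns to minimise latency plus
    # size-dependent processing cost.
    return -(base_latency + per_unit * size)

def train(episodes=2000, alpha=0.1, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    # State: task-size bucket (0 = small, 1 = large); action: node index.
    q = [[0.0] * len(NODES) for _ in range(2)]
    for _ in range(episodes):
        state = rng.randint(0, 1)
        size = 1 if state == 0 else 5
        # Epsilon-greedy action selection.
        if rng.random() < epsilon:
            action = rng.randrange(len(NODES))
        else:
            action = max(range(len(NODES)), key=lambda a: q[state][a])
        r = reward(NODES[action], size)
        # One-step (bandit-style) Q update: no successor state in this toy.
        q[state][action] += alpha * (r - q[state][action])
    return q

q = train()
policy = {("small", "large")[s]: NODES[max(range(len(NODES)),
                                           key=lambda a: q[s][a])]
          for s in range(2)}
print(policy)
```

Under this toy cost model the learned policy offloads small tasks to the nearby edge node and large tasks to the cloud, which mirrors the latency/capacity trade-off that motivates ML-based allocation in fog-edge-cloud hierarchies.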
URN:NBN:sciencein.jist.2023.v11.571
License
Copyright (c) 2023 Pankaj Singh Sisodiya, Vijay Bhandari

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.