Transmission Policies for Energy Harvesting Sensors Based on Markov Chain Energy Supply

Authors

DOI:

https://doi.org/10.4108/eai.28-9-2015.2261406

Keywords:

energy harvesting, wireless sensor networks, energy management, markov decision process

Abstract

Due to small energy harvesting rates and the stochastic nature of energy harvesting processes, energy management of energy harvesting sensors remains crucial for body networks. Transmission policies for energy harvesting sensors with Markov chain energy supply over time-varying channels are formulated as an infinite-horizon discounted-reward Markov Decision Problem under the assumption of a geometric distribution of sensor lifetime. In this paper, we first propose a low-storage transmission policy based on the probability of successful transmission for body networks. We then narrow the feasible region of the policy parameters from the real domain to a finite discrete set, which makes it practical to obtain the optimal parameters by combining the optimality equations with an enumeration algorithm. Finally, numerical results show that the presented transmission policies closely approximate the performance of the optimal policies, which can be derived by the policy iteration algorithm, while requiring much less storage.
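The abstract's baseline is the optimal policy obtained by policy iteration on a discounted MDP. A minimal sketch of that general algorithm on a toy two-state battery model (the states, actions, transition probabilities, and rewards below are hypothetical illustrations, not the paper's body-network model):

```python
import numpy as np

def policy_iteration(P, R, gamma=0.9):
    """Solve a finite discounted MDP by policy iteration.

    P[a] is the (n_states x n_states) transition matrix for action a;
    R[a] is the per-state reward vector for action a.
    """
    n_actions, n_states = len(P), P[0].shape[0]
    policy = np.zeros(n_states, dtype=int)
    while True:
        # Policy evaluation: solve (I - gamma * P_pi) v = r_pi exactly.
        P_pi = np.array([P[policy[s]][s] for s in range(n_states)])
        r_pi = np.array([R[policy[s]][s] for s in range(n_states)])
        v = np.linalg.solve(np.eye(n_states) - gamma * P_pi, r_pi)
        # Policy improvement: greedy one-step lookahead on the Q-values.
        Q = np.array([R[a] + gamma * P[a] @ v for a in range(n_actions)])
        new_policy = Q.argmax(axis=0)
        if np.array_equal(new_policy, policy):
            return policy, v  # stable policy is optimal
        policy = new_policy

# Toy instance: states 0/1 = low/high battery, actions 0/1 = idle/transmit.
P = [np.array([[0.9, 0.1], [0.5, 0.5]]),   # idle: battery may recharge
     np.array([[1.0, 0.0], [0.8, 0.2]])]   # transmit: battery tends to drain
R = [np.array([0.0, 0.0]),                 # idle earns nothing
     np.array([0.1, 1.0])]                 # transmit pays more with energy
policy, v = policy_iteration(P, R)
```

Policy iteration converges in finitely many steps because each improvement step is monotone and the policy space is finite; the paper's low-storage policies trade this exact optimality for a compact parameterization.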


Published

14-12-2015

How to Cite

Zhu W, Xu P, Zheng M, Wu G, Wang H. Transmission Policies for Energy Harvesting Sensors Based on Markov Chain Energy Supply. EAI Endorsed Trans Energy Web [Internet]. 2015 Dec. 14 [cited 2024 Nov. 22];3(8):e4. Available from: https://publications.eai.eu/index.php/ew/article/view/1056

Funding data