Abstract:
Transmitting a single bit over a wireless card can consume up to 1000 times more energy than a 32-bit CPU computation. This fact is critical for battery-powered devices such as mobile phones. By applying data compression, the intended information is sent using fewer bits, thus reducing transmission energy. However, the computational and memory-access requirements of the compression algorithm used could consume more energy than simply transmitting the data uncompressed. Moreover, if the transmission rate over the wireless medium is high, the need for compression might be reduced or even eliminated, as the data is transferred within a minimal amount of time. In light of this, the compression option is investigated through experimental work in different scenarios, recording the energy consumption of data transfers from one mobile device to another. The energy results show that compression does not always lead to energy gains: whenever the received signal strength at the mobile device is high, no compression is needed; however, compression becomes profitable whenever the signal strength weakens. In this thesis, an adaptive scheme is proposed that monitors the signal strength during the transmission process and compresses data on-the-fly whenever an energy gain is expected; otherwise, it sends the data uncompressed. This thesis shows by means of experimental work and simulation that such an adaptive scheme yields significant reductions in energy consumption compared to other related approaches proposed in the literature. Based on the experimental results, an empirical model is finally derived to estimate the energy consumption of a given transmission.
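The core decision of the adaptive scheme can be sketched as follows. This is a minimal illustrative sketch, not the thesis's implementation: the threshold value, the use of zlib, and the function names are assumptions introduced here for clarity.

```python
import zlib

# Hypothetical cutoff between "strong" and "weak" received signal strength;
# the actual decision criterion in the thesis is derived experimentally.
RSSI_THRESHOLD_DBM = -70

def choose_payload(block: bytes, rssi_dbm: float) -> tuple[bytes, bool]:
    """Decide per block whether to compress, based on signal strength.

    Returns (payload, was_compressed).
    """
    if rssi_dbm >= RSSI_THRESHOLD_DBM:
        # Strong signal: transmission is cheap, so skip the compression
        # overhead entirely and send the block as-is.
        return block, False
    compressed = zlib.compress(block)
    # Weak signal: compression pays off only if it actually shrinks the block.
    if len(compressed) < len(block):
        return compressed, True
    return block, False
```

Because the check runs per block, the sender can switch between compressed and uncompressed transmission on-the-fly as the monitored signal strength changes during a transfer.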