Android Alarms – draining battery even when phone is idle?

April 24, 2016
Google Doze presentation

Mobile devices are nowadays our personal computers on the go: we use them to play games, read emails, access social networks, and even do some light productivity work.

This mobility paradigm forced many changes on developers, who now have to be aware of differences in connectivity (cable, WiFi, 3G/4G...), of execution patterns, and of their respective impact on the limited energy budget of these devices.

How do different connectivity types affect energy consumption? (techy details)

Different wireless technologies such as LTE, 3G and WiFi have different energy consumption profiles.

The way these technologies negotiate resources is quite different as well: for 3G/4G it depends on the ISP-defined Radio Resource Controller (we previously described this in a short post – RRC). In some cases, sending a single small packet can consume as much energy as sending multiple packets over a few dozen seconds, due to the tail times of radio channel allocation.
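One practical consequence of tail times is that batching transfers is much cheaper than sending small packets one at a time. The sketch below models this with made-up numbers (the tail duration, transfer time and power draw are illustrative assumptions, not measurements from any real radio):

```java
// Illustrative model of radio "tail time" energy cost.
// All constants are assumed values for demonstration, not measurements.
public class TailTimeDemo {
    static final long TAIL_MS = 10_000;      // radio lingers in high-power state ~10 s after a send (assumed)
    static final long TRANSFER_MS = 100;     // time to push one small packet (assumed)
    static final double HIGH_POWER_MW = 800; // assumed radio draw in the high-power state

    // Energy (mJ) if each packet triggers its own radio activation and pays a full tail.
    static double energySeparate(int packets) {
        return packets * (TRANSFER_MS + TAIL_MS) * HIGH_POWER_MW / 1000.0;
    }

    // Energy (mJ) if packets are coalesced into one activation sharing a single tail.
    static double energyBatched(int packets) {
        return (packets * TRANSFER_MS + TAIL_MS) * HIGH_POWER_MW / 1000.0;
    }

    public static void main(String[] args) {
        System.out.printf("separate: %.0f mJ%n", energySeparate(10));
        System.out.printf("batched:  %.0f mJ%n", energyBatched(10));
    }
}
```

With these (toy) numbers, ten separate sends cost almost ten times the energy of one batched send, because the tail dominates the actual transfer time.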

Huang et al. show that for a bulk data size of 10MB, LTE consumes 1.62 times the energy of WiFi for downlink and 2.53 times for uplink. With the lowest throughput, 3G has the worst energy efficiency for large transfers: downloading 10MB over 3G requires 21.50 times the energy of LTE and 34.77 times the energy of WiFi, while uploading requires 7.97 times the energy of LTE and 20.16 times that of WiFi.

GSMA best practices show how different ISP network configurations can impact the energy consumption of sending even a single packet of data.

The initial energy studies pointed towards the screen, mobile data and the CPU as the main culprits behind the reduced battery life of mobile devices.

Some initial steps towards better energy usage came with projects like GCM (Google Cloud Messaging), where push notifications were proposed to replace the common remote-server polling used to refresh app state, and Google started educating developers about mobile networks.

What is GCM and how does it save energy? (techy details)

Polling remote servers means that each app periodically wakes the phone to check whether the server has news (e.g., Facebook status updates). This incurs an extra energy cost: not only waking the device, but also communicating with the server even when there are no updates.

To solve this, GCM proposed a notification system that notifies apps when there are updates. A Google service acts as an intermediary between the apps and the remote servers. It works similarly to a message-queuing system: mobile devices only wake when there are updates, and they only need to maintain a single connection to serve all the apps using GCM (fewer keep-alive messages).
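The saving is easy to see in terms of device wake-ups. A toy calculation (the 5-minute polling interval and the 20 daily updates are assumed values for illustration):

```java
// Toy comparison of daily device wake-ups: periodic polling vs. push.
// The polling interval and daily update count below are assumptions.
public class PollVsPush {
    // With polling, the device wakes once per interval whether or not
    // the server actually has anything new.
    static long pollingWakeupsPerDay(long intervalMinutes) {
        return 24 * 60 / intervalMinutes;
    }

    // With push, the device only wakes when a notification actually arrives.
    static long pushWakeupsPerDay(long updates) {
        return updates;
    }

    public static void main(String[] args) {
        System.out.println("polling every 5 min: " + pollingWakeupsPerDay(5) + " wake-ups/day");
        System.out.println("push, 20 updates:    " + pushWakeupsPerDay(20) + " wake-ups/day");
    }
}
```

Polling every 5 minutes means 288 wake-ups per day per app, regardless of whether anything changed; with push, an app that gets 20 updates causes 20 wake-ups, shared over a single connection.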

This sounds wonderful, but companies like Facebook, which probably don't want their data to pass through a Google service, soon started developing their own push notification systems (e.g., based on MQTT), often not even sharing the same connection across apps of the same company (e.g., Facebook Messenger and the main Facebook app).

For further details, there is this nice technical blog post.

Unfortunately, Google soon realized that, with the increasing popularity of Android and the rate at which new apps appear in its market, relying solely on developers to make the correct design decisions is not always enough.

While GCM seemed promising, its adoption fell short of expectations, and Google needed a new strategy to reduce the energy consumption of apps to keep pace with competing OSes such as iOS, which is far more restrictive towards developers' "misbehavior" (e.g., it has no direct analogue of Android Alarms).

What is an Android Alarm?

In Android, battery life tends to decrease with the number of installed apps. This is because apps can arbitrarily decide when and what to execute, even when the user is not actively interacting with the device – i.e., background tasks.

Background tasks can be defined through multiple mechanisms in Android (e.g., AsyncTasks, Timers, Services...), but if you want to execute tasks periodically, with the capability to wake the device, then you should use Alarms (the AlarmManager API).

Even when the device is idle, apps can resort to alarms to periodically wake it up, and they can also acquire wake locks that prevent the device from going back to sleep.
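As a concrete sketch, this is roughly how an app schedules a repeating wake-up alarm with AlarmManager (the 15-minute interval is arbitrary and SyncReceiver is a hypothetical BroadcastReceiver; being Android framework code, this only runs inside an app):

```java
import android.app.AlarmManager;
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;

// Sketch: schedule a repeating alarm that wakes the device from sleep.
// SyncReceiver is a hypothetical BroadcastReceiver declared in the manifest.
public class AlarmScheduler {
    public static void scheduleRepeatingSync(Context context) {
        AlarmManager am = (AlarmManager) context.getSystemService(Context.ALARM_SERVICE);
        Intent intent = new Intent(context, SyncReceiver.class);
        PendingIntent pi = PendingIntent.getBroadcast(context, 0, intent, 0);

        // RTC_WAKEUP wakes the device even when it is idle -- exactly the
        // behavior that drains the battery if every app does it.
        am.setRepeating(AlarmManager.RTC_WAKEUP,
                System.currentTimeMillis(),
                15 * 60 * 1000L, // every 15 minutes (illustrative)
                pi);
    }
}
```

Note that since API 19 (KitKat), setRepeating() is treated as inexact so the system can batch alarms from different apps, which already hints at where Google was heading.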

So why are alarms a strategic target for reducing energy consumption? More specifically:

  • How are alarms related to data energy consumption?
  • How often do apps run in background?
  • Why and how are developers misusing alarms?
  • How can we improve alarm usage?
  • What is Doze and how does it address the alarms problems?
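
On that last point: Doze, introduced in Android 6.0, defers standard alarms while the device sits idle, and apps that genuinely need to fire during an idle window must opt in explicitly. A sketch of the Doze-era API (again, SyncReceiver is a hypothetical receiver):

```java
import android.app.AlarmManager;
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import android.os.Build;

// Sketch: a Doze-aware one-shot alarm. setAndAllowWhileIdle (API 23+) may
// fire during Doze, but the system rate-limits it (roughly once per
// 9 minutes per app), so it cannot be abused for frequent wake-ups.
public class DozeAwareScheduler {
    public static void scheduleOneShot(Context context, long delayMs) {
        AlarmManager am = (AlarmManager) context.getSystemService(Context.ALARM_SERVICE);
        Intent intent = new Intent(context, SyncReceiver.class); // hypothetical receiver
        PendingIntent pi = PendingIntent.getBroadcast(context, 0, intent, 0);
        long triggerAt = System.currentTimeMillis() + delayMs;

        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
            // Allowed to fire while idle, but heavily throttled by the system.
            am.setAndAllowWhileIdle(AlarmManager.RTC_WAKEUP, triggerAt, pi);
        } else {
            am.set(AlarmManager.RTC_WAKEUP, triggerAt, pi);
        }
    }
}
```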

I have addressed these and many other topics in my alarms study; you can check the details in my research paper and presentation at the PAM conference.

Cheers!
