By Agrim Agrawal
India’s government is mulling a new set of handset security requirements that, if notified, would bring smartphones closer to the kind of compliance regime once reserved for telecom networks and critical infrastructure. Draft provisions seen by Reuters would require manufacturers to run vulnerability analysis on their operating systems, while also giving government-designated labs the authority to verify those claims through source-code review, a look at the industry’s most fiercely protected intellectual property.
The draft also takes aim at the major privacy and abuse issue of modern smartphones: permissions. Applications would be barred from background access to the camera, microphone and location services while the device is inactive, and the status bar would have to indicate when those permissions are in use. Phones would also periodically prompt users to review application permissions, a requirement device-makers say would be hard to test methodically and would change the foundational user experience of Android variants and iOS alike. Even more contentious are clauses affecting how quickly security patches reach users.
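The periodic permission-review prompt can be thought of as a simple age check over each app’s sensitive grants. A minimal sketch, assuming a hypothetical inventory of per-app grant timestamps and an illustrative 90-day cadence (none of these names or thresholds come from the draft):

```python
from datetime import datetime, timedelta

# Hypothetical record of when each sensitive permission was last granted
# or reviewed by the user (illustrative data, not a real platform API).
PERMISSION_GRANTS = {
    ("maps_app", "location"): datetime(2025, 1, 10),
    ("chat_app", "microphone"): datetime(2024, 6, 2),
    ("camera_app", "camera"): datetime(2024, 3, 15),
}

# Assumed review cadence; the draft does not specify an interval.
REVIEW_INTERVAL = timedelta(days=90)

def permissions_due_for_review(grants, now):
    """Return sorted (app, permission) pairs not reviewed within the interval."""
    return sorted(
        (app, perm)
        for (app, perm), last_reviewed in grants.items()
        if now - last_reviewed > REVIEW_INTERVAL
    )
```

Calling `permissions_due_for_review(PERMISSION_GRANTS, datetime(2025, 3, 1))` would flag the microphone and camera grants, the kind of list a phone could surface as a reminder.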
The draft also says companies should notify a government body before delivering “major” software updates or patches. Other provisions include mandatory 12-month retention of security audit logs (attempted app installs, login attempts and so on), regular malware scanning, tamper-detection alerts when a device has been rooted or jailbroken, and “anti-rollback” protections to prevent older software versions from being installed. Industry group MAIT has pushed back, arguing that phones lack the storage for year-long logs and that the notification requirement adds procedural friction to patching, leaving users exposed during windows of active exploitation. But the source-code clause is where the technical architecture of mobile security and national regulatory oversight collide most sharply.
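The anti-rollback idea reduces to a monotonic version counter: an update is accepted only if its rollback index is not lower than the one currently installed. A simplified sketch; real implementations (Android Verified Boot’s rollback index is the closest analogue) bind this counter to signed update metadata and tamper-resistant storage rather than plain integers:

```python
class RollbackError(Exception):
    """Raised when an update would downgrade the installed software."""

def check_anti_rollback(installed_index: int, update_index: int) -> None:
    """Reject updates whose rollback index is lower than the installed one.

    Simplified model: production systems keep the index in hardware-backed
    storage and verify it against signed metadata before flashing.
    """
    if update_index < installed_index:
        raise RollbackError(
            f"update index {update_index} is older than installed "
            f"index {installed_index}; refusing downgrade"
        )

def apply_update(installed_index: int, update_index: int) -> int:
    """Return the rollback index to store after a successful update."""
    check_anti_rollback(installed_index, update_index)
    return max(installed_index, update_index)
```

The stored index never decreases, which is exactly what blocks an attacker from reinstalling an older, vulnerable build that was otherwise validly signed.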
For Apple, there is no way around the fact that iOS is proprietary end to end; for Google, the open-source core of Android (AOSP) is only one piece of what ships on devices, and vendor customizations, proprietary services, firmware and secure components form the real attack surface. A mandatory “review” could therefore devolve into a cumbersome, expensive compliance exercise spanning silicon partners and OEMs, especially if labs demand build reproducibility, symbol files and access to security modules that suppliers consider trade secrets. The government, for its part, insists the story is still fluid: after the Reuters report, India’s Press Information Bureau publicly disputed assertions that the rules would force manufacturers to hand over source code, calling the reporting misleading and saying the framework is aimed at curbing fraud and strengthening mobile security.
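Build reproducibility, the property that would let a lab rebuild a binary from source and compare it bit for bit against what actually shipped, reduces in the simplest case to hashing both artifacts. A toy sketch; real reproducible-build verification also has to normalize embedded timestamps, build paths and code signatures before comparison:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a build artifact so independently built copies can be compared."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def builds_reproducible(vendor_artifact: Path, lab_artifact: Path) -> bool:
    """True if the lab's rebuild is bit-identical to the vendor's binary."""
    return sha256_of(vendor_artifact) == sha256_of(lab_artifact)
```

The appeal for a regulator is that a matching hash proves the reviewed source corresponds to the shipped binary without the lab ever retaining the code; the difficulty, as manufacturers note, is that most mobile build pipelines are nowhere near bit-reproducible today.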
Civil society groups, including the Internet Freedom Foundation, have warned that such policies would create a “honeypot” of sensitive code and logs, and are pushing for public consultation. What results is the familiar trade-off between policy and scale: India wants stronger guarantees for the hundreds of millions of devices its citizens use for payments and identity-linked services, while global technology players watch warily, worried that the rules could set a precedent for exposing proprietary systems, slow down patches that need to ship quickly, and add friction to India’s fast-growing premium smartphone sector.
