A new app called Neon recently experienced a viral rise to the top of Apple's App Store charts. Its business model was simple: it offered to pay users for their phone call recordings, with the stated purpose of selling that data to AI companies to help train their models. But the app's moment in the spotlight was abruptly cut short by a serious security incident. If you were one of the iPhone users who accepted the conditions of the Neon app, your data may have been exposed.
Neon iPhone app’s security flaw exposed call recordings, phone numbers, and more
The app has already been taken down from the App Store. At the heart of the shutdown was a critical security flaw: the vulnerability allowed any logged-in user to access other users' sensitive, private data. The problem was not the result of a hack by an outside party but a fundamental flaw in the app's server setup, which failed to verify that users could only request their own data. As a result, anyone with a little technical knowledge could easily pull up information belonging to someone else.
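For illustration only, since none of Neon's actual code is public, the class of bug described here usually looks like the sketch below: the server authenticates the user but never checks whether the requested record belongs to them. The endpoint paths, data model, and names are hypothetical.

```python
# Hypothetical sketch of a missing ownership check (an insecure direct
# object reference); endpoints and data model are invented for illustration.
from flask import Flask, abort, g, jsonify

app = Flask(__name__)

# Pretend datastore: call records keyed by ID, each tagged with its owner.
CALL_RECORDS = {
    "rec_001": {"owner_id": "user_a", "callee": "+15551234567",
                "duration_s": 842, "audio_url": "https://cdn.example.com/rec_001.mp3"},
    "rec_002": {"owner_id": "user_b", "callee": "+15559876543",
                "duration_s": 61, "audio_url": "https://cdn.example.com/rec_002.mp3"},
}

@app.before_request
def fake_auth():
    # Stand-in for real session handling: assume the request is already
    # authenticated and we know which user is logged in.
    g.current_user_id = "user_a"

# VULNERABLE: any logged-in user can fetch any record just by guessing its ID.
@app.route("/v1/calls/<record_id>")
def get_call_vulnerable(record_id):
    record = CALL_RECORDS.get(record_id) or abort(404)
    return jsonify(record)  # no check that the caller owns this record

# FIXED: the server also confirms the record belongs to the requesting user.
@app.route("/v2/calls/<record_id>")
def get_call_fixed(record_id):
    record = CALL_RECORDS.get(record_id) or abort(404)
    if record["owner_id"] != g.current_user_id:
        abort(403)  # authenticated, but not authorized for this resource
    return jsonify(record)
```

The point of the sketch is that authentication alone is not enough: the "/v1" handler fails in exactly the way described above, while the "/v2" handler adds the ownership check that keeps one user's records out of another user's hands.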
The vulnerability exposed a wide range of sensitive information. TechCrunch reporters who investigated the flaw found they could access a user's phone number, the phone number of the person they called, and even the full audio recordings and detailed transcripts of those conversations, all reachable through publicly available web links. The investigation also turned up call metadata, such as the time and duration of each call. The discovery highlighted a troubling reality: some users appeared to be making lengthy, covert calls specifically to earn money from the app's payment system.
Company avoids mentioning the data leak in its official statement
After being notified of the security lapse, the app's founder took its servers offline. In a message sent to users, the company cited a need to “add extra layers of security” during a period of rapid growth. The message, however, made no mention of the security flaw or the fact that users' personal data had been exposed, a response that raises significant questions about transparency and the responsibility of companies handling sensitive information.