Security risk in Starbucks app a 'wakeup call' for consumers

The weak protection of customer data in Starbucks' mobile-payment app is a "wakeup call" for consumers, who should never assume the apps on their smartphones are secure.


Starbucks acknowledged this week that its app stores usernames, email addresses and passwords in clear text. As a result, anyone could see the information by connecting the phone to a PC.

To make the app easier to use, Starbucks chose not to encrypt the data or keep it solely on its servers. Taking those additional security measures would have meant requiring users to log on each time they used the app. With the data stored in clear text on the phone, users had to log in only once, until they added more money to their account.
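For comparison, here is a minimal sketch of what safer credential handling can look like on iOS, using Apple's Keychain Services rather than a clear-text file on the device. The service name and helper functions are illustrative placeholders, not code from the Starbucks app.

```swift
import Foundation
import Security

// Illustrative helpers: store and retrieve a password in the iOS Keychain
// instead of writing it to a plain-text file. "com.example.coffeeapp" and
// the function names are placeholders for this sketch.
func savePassword(_ password: String, for account: String) -> Bool {
    guard let data = password.data(using: .utf8) else { return false }

    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "com.example.coffeeapp",
        kSecAttrAccount as String: account,
        // Readable only while the device is unlocked; never migrated off-device.
        kSecAttrAccessible as String: kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
        kSecValueData as String: data
    ]

    // Remove any stale copy, then add the new item.
    SecItemDelete(query as CFDictionary)
    return SecItemAdd(query as CFDictionary, nil) == errSecSuccess
}

func loadPassword(for account: String) -> String? {
    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "com.example.coffeeapp",
        kSecAttrAccount as String: account,
        kSecReturnData as String: true,
        kSecMatchLimit as String: kSecMatchLimitOne
    ]

    var result: AnyObject?
    guard SecItemCopyMatching(query as CFDictionary, &result) == errSecSuccess,
          let data = result as? Data else { return nil }
    return String(data: data, encoding: .utf8)
}
```

Items stored this way are encrypted by the operating system, so they are not exposed by simply connecting the phone to a PC and browsing its files.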

"The recent news that the Starbucks mobile app is not adequately protecting usernames and passwords should be a wakeup call for us -- both as mobile consumers and employees," Jack Walsh, mobility program manager at software testing and certification firm ICSA Labs, said. "No one should assume that their company's mobile apps are safe and properly secure sensitive employee or customer data."

In balancing security with usability, Starbucks apparently decided that the customer data was safe enough, because someone had to have the smartphone in their possession to get the data.

While the retailer believed that was enough security, experts disagreed.

"Any app that stores usernames and passwords should be protecting their users by encrypting their data -- especially applications oriented towards financial transactions," Lee Cocking, vice president of product and strategy at mobile security vendor Fixmo, said. "The risk of not protecting sensitive information is significant data leakage and potential financial losses."

The biggest risk posed by Starbucks' app stems from people's habit of using the same password across apps and Web sites to avoid having to remember multiple passwords, Bob O'Donnell, founder and chief analyst for TECHnalysis Research, said.

"The vast majority of people only use a few passwords, and that's a problem," O'Donnell said.

People's poor password habits are one reason smartphone manufacturers have begun adding biometrics as a second layer of protection. For example, Apple added a fingerprint reader to the iPhone 5S released in September.
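As a rough illustration of how a fingerprint check can be layered on top of a password, the sketch below uses Apple's LocalAuthentication framework to gate access to account details; the prompt text and fallback behavior are assumptions for the example, not any vendor's actual implementation.

```swift
import LocalAuthentication

// Illustrative sketch: prompt for Touch ID before showing stored account
// details, falling back to the app's own password prompt if biometrics
// are unavailable.
func unlockAccount(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        // No biometric hardware or no enrolled fingerprint.
        completion(false)
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account details") { success, _ in
        DispatchQueue.main.async {
            completion(success)
        }
    }
}
```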


For companies building their own mobile apps or contracting with third-party developers, Walsh recommends independent testing.

"It shouldn't be once-and-done testing," he said. "Just like any other app you've created, whether it's a desktop app, a Web app or whatever, you want to have it tested throughout its lifecycle."

Proper security testing is not a common practice among app developers, recent studies show. In testing the iOS apps of 60 financial institutions, security firm IOActive found that all of them could be installed and run on jailbroken iPhones.

This is a security risk because jailbreaking circumvents iOS protections, allowing apps to access resources that would otherwise be restricted to other apps.
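One common, if imperfect, mitigation is for an app to check for signs of a jailbreak before handling sensitive data. The sketch below shows a typical heuristic in Swift; the specific paths checked are assumptions for the example, and such checks can be bypassed by a determined attacker.

```swift
import Foundation

// Rough heuristic an app might run before handling sensitive data.
// These file checks are a speed bump, not a guarantee.
func looksJailbroken() -> Bool {
    // Files that typically exist only after the sandbox has been broken.
    let suspiciousPaths = [
        "/Applications/Cydia.app",
        "/bin/bash",
        "/usr/sbin/sshd",
        "/private/var/lib/apt"
    ]
    if suspiciousPaths.contains(where: { FileManager.default.fileExists(atPath: $0) }) {
        return true
    }

    // A sandboxed app should not be able to write outside its container.
    let testPath = "/private/jailbreak_test.txt"
    do {
        try "test".write(toFile: testPath, atomically: true, encoding: .utf8)
        try FileManager.default.removeItem(atPath: testPath)
        return true
    } catch {
        return false
    }
}
```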

In a separate study, RIIS LLC, a firm that specializes in mobile app development, found that Android apps provided by some of the nation's top brands in the airline, retail, entertainment and insurance industries placed users' personal information at risk.
