Friday, 1 February 2019

Everything you need to know about Facebook, Google’s app scandal

Facebook and Google landed in hot water with Apple this week after two investigations by TechCrunch revealed the misuse of internal-only certificates — leading to their revocation, which led to a day of downtime at the two tech giants.

Confused about what happened? Here’s everything you need to know.

How did all this start, and what happened?

On Monday, we revealed that Facebook was misusing an Apple-issued certificate that is only meant for companies to use to distribute internal, employee-only apps without having to go through the Apple App Store. But the social media giant used that certificate to sign an app that Facebook distributed outside the company, violating Apple’s rules.

The app, known simply as “Research,” allowed Facebook access to all the data flowing out of the device it was installed on. Facebook paid users — including teenagers — $20 per month to install the app. But it wasn’t clear exactly what kind of data was being vacuumed up, or for what reason.

It turns out that the app was a repackaged app that was effectively banned from Apple’s App Store last year for collecting too much data on users.

Apple was angry that Facebook was misusing its special-issue certificate to push an app it had already banned, and revoked the certificate — rendering the app useless. But Facebook was using that same certificate to sign its other employee-only apps, so revoking it effectively knocked them offline until Apple re-issued the certificate.

Then, it turned out Google was doing almost exactly the same thing with its Screenwise app, and Apple’s ban-hammer fell again.

What’s the controversy over these certificates and what can they do?

If you want to develop Apple apps, you have to abide by its rules.

A key rule is that Apple doesn’t allow app developers to bypass the App Store, where every app is vetted to ensure it’s as secure as it can be. It does, however, grant exceptions for enterprise developers, such as to companies that want to build apps that are only used internally by employees. Facebook and Google in this case signed up to be enterprise developers and agreed to Apple’s developer terms.

Apple granted each a certificate that grants permission to distribute apps they develop internally — including pre-release versions of the apps they make, for testing purposes. But these certificates aren’t allowed to be used for apps aimed at ordinary consumers, who have to download apps through the App Store.

Why is “root” certificate access a big deal?

Because Facebook’s Research and Google’s Screenwise apps were distributed outside of Apple’s App Store, users had to install them manually — a process known as sideloading. That requires users to go through a convoluted few steps: downloading the app itself, then opening and installing either Facebook’s or Google’s certificate.

Both apps then required users to install a VPN configuration profile, allowing all of the data flowing out of that user’s phone to funnel down a special tunnel that directs it all to either Facebook or Google, depending on which app was installed.

This is where Facebook and Google’s cases differ.

Google’s app collected data and sent it off to Google for research purposes, but couldn’t access encrypted data — such as iMessages, or other end-to-end encrypted content.

Facebook, however, went far further. Its users were asked to go through an additional step to trust the certificate at the “root” level of the phone. Trusting this “root certificate” allowed Facebook to look at all of the encrypted traffic flowing out of the device — essentially what we call a “man-in-the-middle” attack. That allowed Facebook to sift through your messages, your emails, and any other bit of data that leaves your phone. Only apps that use certificate pinning — rejecting any certificate that isn’t their own — were protected.

Facebook’s Research app requires root certificate access, which lets Facebook gather almost any piece of data transmitted by your phone. (Image: supplied)
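To see why pinned apps were immune, here is a minimal, hypothetical Python sketch of certificate pinning — not code from Facebook’s app or any real client, and the certificate bytes are stand-ins. A pinned client hard-codes the fingerprint of the certificate it expects and refuses anything else, even a certificate the operating system trusts because the user installed a new root:

```python
import hashlib

# Stand-in DER bytes for the legitimate server certificate (assumption:
# a real app would compute this fingerprint from its actual certificate).
LEGIT_CERT_DER = b"example-server-certificate-der-bytes"
PINNED_SHA256 = hashlib.sha256(LEGIT_CERT_DER).hexdigest()

def connection_allowed(presented_cert_der: bytes) -> bool:
    """Accept the TLS connection only if the presented certificate
    matches the pinned fingerprint exactly."""
    return hashlib.sha256(presented_cert_der).hexdigest() == PINNED_SHA256

# The genuine certificate passes the pin check:
assert connection_allowed(LEGIT_CERT_DER)
# A certificate minted on the fly by an interception proxy fails it,
# no matter which root certificates the phone has been told to trust:
assert not connection_allowed(b"mitm-proxy-forged-certificate")
```

This is the design choice that defeats a root-certificate man-in-the-middle: the trust decision is made against a single hard-coded value, not against the device’s (user-modifiable) list of trusted roots.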

Google’s app might not have been able to look at encrypted traffic, but the company still flouted the rules and got its certificate revoked anyway.

What data did Facebook have access to on iOS?

It’s hard to know for sure, but it definitely had access to more data than Google.

Facebook said its app was to help it “understand how people use their mobile devices.” In reality, at root traffic level, Facebook could have accessed any kind of data that left your phone.

Will Strafach, a security expert who we spoke to for our story, said: “If Facebook makes full use of the level of access they are given by asking users to install the certificate, they will have the ability to continuously collect the following types of data: private messages in social media apps, chats from instant messaging apps – including photos/videos sent to others, emails, web searches, web browsing activity, and even ongoing location information by tapping into the feeds of any location tracking apps you may have installed.”

Remember: this isn’t “root” access to your phone, like jailbreaking, but root access to the network traffic.

How does this compare to the technical ways other market research programs work?

In fairness, market research apps aren’t unique to Facebook or Google. Several other companies, like Nielsen and comScore, run similar programs, but neither asks users to install a VPN or provide root access to the network.

In any case, Facebook already has a lot of your data — as does Google. Even if the companies only wanted to look at your data in aggregate with other people’s, they can still home in on who you talk to, when, for how long, and in some cases what about. It might not have been such an explosive scandal had Facebook not spent the last year cleaning up after several security and privacy breaches.

Can they capture the data of people the phone owner interacts with?

In both cases, yes. In Google’s case, any unencrypted data involving another person could have been collected. In Facebook’s case, it goes far further — any data of yours that involves another person, such as an email or a message, could have been collected by Facebook’s app.

How many people did this affect?

It’s hard to know for sure. Neither Google nor Facebook has said how many users its research app had. Between them, it’s believed to be in the thousands. As for the employees affected by the app outages, Facebook has more than 35,000 employees and Google has more than 94,000.

Why did internal apps at Facebook and Google break after Apple revoked the certificates?

You might own your Apple device, but Apple still gets to control what goes on it.

After Facebook was caught out, Apple said: “Any developer using their enterprise certificates to distribute apps to consumers will have their certificates revoked, which is what we did in this case to protect our users and their data.” That meant any app relying on the certificate — including those used inside the company — would fail to load. Not just the pre-release builds of Facebook, Instagram and WhatsApp that staff were working on: reportedly the company’s travel and collaboration apps were also down. In Google’s case, even its catering and lunch menu apps went down.

Facebook’s internal apps were down for about a day, while Google’s internal apps were down for a few hours. Neither Facebook’s nor Google’s consumer services were affected, however.

How are people viewing Apple in all this?

Nobody seems thrilled with Facebook or Google at the moment, but not many are happy with Apple, either. Even though Apple sells hardware and doesn’t use your data to profile you or serve you ads — like Facebook and Google do — some are uncomfortable with how much power Apple has over the customers — and enterprises — that use its devices.

In revoking Facebook’s and Google’s enterprise certificates, Apple caused downtime with a knock-on effect inside both companies.

Is this legal in the U.S.? What about in Europe with GDPR?

Well, it’s not illegal — at least in the U.S. Facebook says it obtained consent from its users. The company even said its teenage users must obtain parental consent — even though that step was easily skippable and no verification checks were made. It wasn’t even clear that the children who “consented” really understood how much privacy they were handing over.

That could lead to major regulatory headaches down the line. “If it turns out that European teens have been participating in the research effort Facebook could face another barrage of complaints under the bloc’s General Data Protection Regulation (GDPR) — and the prospect of substantial fines if any local agencies determine it failed to live up to consent and ‘privacy by design’ requirements baked into the bloc’s privacy regime,” wrote TechCrunch’s Natasha Lomas.

Who else has been misusing certificates?

Don’t think that Facebook and Google are alone in this. It turns out that a lot of companies might be flouting the rules, too.

According to several people digging around on social media, Sonos uses enterprise certificates for its beta program, as do finance app Binance and DoorDash, the latter for its fleet of contractors. It’s not known if Apple will also revoke their certificates.

What next?

It’s anybody’s guess, but don’t expect this situation to die down any time soon.

Facebook may face repercussions in Europe, as well as at home. Two U.S. senators, Mark Warner and Richard Blumenthal, have already called for action, accusing Facebook of “wiretapping teens.” The Federal Trade Commission may also investigate, if Blumenthal gets his way.



from Apple – TechCrunch https://tcrn.ch/2HLKWoY

Apple fixes FaceTime eavesdrop bug, with software update incoming

Three days after Apple pulled its new Group FaceTime feature offline after users found they could eavesdrop on people before accepting a call, the company says it’s fixed the bug on its end.

“We have fixed the Group FaceTime security bug on Apple’s servers and we will issue a software update to re-enable the feature for users next week,” said Apple in a statement. “We sincerely apologize to our customers who were affected and all who were concerned about this security issue. We appreciate everyone’s patience as we complete this process.”

The bug allowed anyone to swipe up and add themselves to a Group FaceTime call, a new group video feature that Apple introduced last year. TechCrunch verified the bug after it began making the rounds on social media.

To prevent misuse, Apple pulled the plug on Group FaceTime on its servers.

Apple continued: “We want to assure our customers that as soon as our engineering team became aware of the details necessary to reproduce the bug, they quickly disabled Group FaceTime and began work on the fix.”

But the privacy issue came after reports that a 14-year-old from Arizona and his mother tried to report the bug to Apple days before to no avail, citing difficulties in contacting the company.

In Friday’s statement, Apple thanked the Thompson family for reporting the bug.

“We are committed to improving the process by which we receive and escalate these reports, in order to get them to the right people as fast as possible. We take the security of our products extremely seriously and we are committed to continuing to earn the trust Apple customers place in us,” the statement added.

New York’s attorney general Letitia James and governor Andrew Cuomo said they would investigate the incident.



from Apple – TechCrunch https://tcrn.ch/2RZ0v1b

First China, now Starbucks gets an ambitious VC-funded rival in Indonesia

Asia’s venture capital-backed startups are gunning for Starbucks.

In China, the U.S. coffee giant is being pushed by Luckin Coffee, a $2.2 billion challenger surfing China’s on-demand wave — and on the real estate side by WeWork China, which has just unveiled an on-demand product that could tempt the people who go to Starbucks to kill time or work.

That trend is picking up in Indonesia, the world’s fourth most populous country and Southeast Asia’s largest economy, where an on-demand challenger named Fore Coffee has fuelled up for a fight after it raised $8.5 million.

Fore was started in August 2018 when associates at East Ventures, a prolific early-stage investor in Indonesia, decided to test how robust the country’s new digital infrastructure can be. That means it taps into unicorn companies like Grab, Go-Jek and Tokopedia and their army of scooter-based delivery people to get a hot brew out to customers. Incidentally, the name ‘Fore’ comes from ‘forest’ — “we aim to grow fast, strong, tall and bring life to our surrounding” — rather than in front of… or a shout heard on the golf course.

The company has adopted a hybrid approach similar to that of Luckin — and of Starbucks, thanks to its alliance with Alibaba. Fore operates 15 outlets in Jakarta, which range from ‘grab and go’ kiosks for workers in a hurry, to shops with space to sit, to delivery-only locations, Fore co-founder Elisa Suteja told TechCrunch. On the digital side, it offers its own app (delivery is handled via Tokopedia’s Go-Send service) and is available via Go-Jek and Grab’s apps.

So far, Fore has jumped to 100,000 deliveries per month and its app is top of the F&B category for iOS and Android in Indonesia — ahead of Starbucks, McDonald’s and Pizza Hut.

It’s early days for the venture — which is not a patch on Starbucks’ $85 billion business, which doesn’t break out figures for Indonesia — but it is a sign of where consumption in Indonesia is moving. The country has become a coveted beachhead for global companies, especially Chinese ones, moving into Southeast Asia. Chinese trio Tencent, Alibaba and JD.com and Singapore’s Grab are among the outsiders who have each spent hundreds of millions of dollars to build or invest in services that tap growing internet access among Indonesia’s population of over 260 million.

There’s a lot at stake. A recent Google-Temasek report forecast that Indonesia alone will account for over 40 percent of Southeast Asia’s digital economy by 2025, which is predicted to triple to reach $240 billion.

As one founder recently told TechCrunch anonymously: “There is no such thing as winning Southeast Asia but losing Indonesia. The number one priority for any Southeast Asian business must be to win Indonesia.”

Forecasts from a recent Google-Temasek report suggest that Indonesia is the key market in Southeast Asia

This new money comes from East Ventures — which incubated the project — SMDV, Pavilion Capital, Agaeti Venture Capital and Insignia Ventures Partners with participation from undisclosed angel backers. The plan is to continue to invest in growing the business.

“Fore is our model for ‘super-SME’ — SME done right in leveraging technology and digital ecosystem,” Willson Cuaca, a managing partner at East Ventures, said in a statement.

There’s clearly a long way to go before Fore reaches the size of Luckin, which has said it lost 850 million yuan, or $124 million, in the first nine months of 2018.

The Chinese coffee challenger recently declared that money is no object for its strategy to dethrone Starbucks. The U.S. firm is currently the largest player in China’s coffee market, with 3,300 stores as of last May and a goal of topping 6,000 outlets by 2022, but Luckin said it will more than double its locations to more than 4,500 by the end of this year.

By comparison, Indonesia’s coffee battle is only just getting started.



from Android – TechCrunch https://tcrn.ch/2Gff9dA
via IFTTT

Thursday, 31 January 2019

We dismantle Facebook’s memo defending its “Research”

Facebook published an internal memo today trying to minimize the morale damage of TechCrunch’s investigation that revealed it’d been paying people to suck in all their phone data. Obtained by Business Insider’s Rob Price, the memo from Facebook’s VP of production engineering and security Pedro Canahuati gives us more detail about exactly what data Facebook was trying to collect from teens and adults in the US and India. But it also tries to claim the program wasn’t secret, wasn’t spying, and that Facebook doesn’t see it as a violation of Apple’s policy against using its Enterprise Certificate system to distribute apps to non-employees — despite Apple punishing it for the violation.

For reference, Facebook was recruiting users age 13-35 to install a Research app, VPN, and give it root network access so it could analyze all their traffic. It’s pretty sketchy to be buying people’s privacy, and despite being shut down on iOS, it’s still running on Android.

Here we lay out the memo, with section-by-section responses to Facebook’s claims challenging TechCrunch’s reporting. Our responses are in bold, and we’ve added images.

Memo from Facebook VP Pedro Canahuati

APPLE ENTERPRISE CERTS REINSTATED

Early this morning, we received agreement from Apple to issue a new enterprise certificate; this has allowed us to produce new builds of our public and enterprise apps for use by employees and contractors. Because we have a few dozen apps to rebuild, we’re initially focusing on the most critical ones, prioritized by usage and importance: Facebook, Messenger, Workplace, Work Chat, Instagram, and Mobile Home.

New builds of these apps will soon be available and we’ll email all iOS users for detailed instructions on how to reinstall. We’ll also post to iOS FYI with full details.

Meanwhile, we’re expecting a follow-up article from the New York Times later today, so I wanted to share a bit more information and background on the situation.

What happened?

On Tuesday TechCrunch reported on our Facebook Research program. This is a market research program that helps us understand consumer behavior and trends to build better mobile products.

TechCrunch implied we hid the fact that this is by Facebook – we don’t. Participants have to download an app called Facebook Research App to be involved in the study. They also characterized this as “spying,” which we don’t agree with. People participated in this program with full knowledge that Facebook was sponsoring this research, and were paid for it. They could opt-out at any time. As we built this program, we specifically wanted to make sure we were as transparent as possible about what we were doing, what information we were gathering, and what it was for — see the screenshots below.

We used an app that we built ourselves, which wasn’t distributed via the App Store, to do this work. Instead it was side-loaded via our enterprise certificate. Apple has indicated that this broke their Terms of Service so disabled our enterprise certificates which allow us to install our own apps on devices outside of the official app store for internal dogfooding.

Author’s response: To start, “build better products” is a vague way of saying determining what’s popular and buying or building it. Facebook has used competitive analysis gathered by its similar Onavo Protect app and Facebook Research app for years to figure out what apps were gaining momentum and either bring them in or box them out. Onavo’s data is how Facebook knew WhatsApp was sending twice as many messages as Messenger, and that it should invest $19 billion to acquire it.

Facebook claims it didn’t hide the program, but it was never formally announced like every other Facebook product. There were no Facebook Help pages, blog posts, or support info from the company. It used intermediaries Applause (which owns uTest) and CentreCode (which owns Betabound) to run the program under names like Project Atlas and Project Kodiak. Users only found out Facebook was involved once they started the sign-up process and signed a non-disclosure agreement prohibiting them from discussing it publicly.

TechCrunch has reviewed communications indicating Facebook would threaten legal action if a user spoke publicly about being part of the Research program. While the program had run since 2016, it had never been reported on. We believe that these facts combined justify characterizing the program as “secret”.

The Facebook Research program was called Project Atlas until you signed up

How does this program work?

We partner with a couple of market research companies (Applause and CentreCode) to source and onboard candidates based in India and USA for this research project. Once people are onboarded through a generic registration page, they are informed that this research will be for Facebook and can decline to participate or opt out at any point. We rely on a 3rd party vendor for a number of reasons, including their ability to target a diverse and representative pool of participants. They use a generic initial registration page to avoid bias in the people who choose to participate.

After generic onboarding people are asked to download an app called the ‘Facebook Research App,’ which takes them through a consent flow that requires people to check boxes to confirm they understand what information will be collected. As mentioned above, we worked hard to make this as explicit and clear as possible.

This is part of a broader set of research programs we conduct. Asking users to allow us to collect data on their device usage is a highly efficient way of getting industry data from closed ecosystems, such as iOS and Android. We believe this is a valid method of market research.

Author’s response: Facebook claims it wasn’t “spying,” yet it never fully laid out the specific kinds of information it would collect. In some cases, descriptions of the app’s data collection power were included in merely a footnote. The program did not list the specific data types gathered, saying only that it would scoop up “which apps are on your phone, how and when you use them” and “information about your internet browsing activity.”

The parental consent form from Facebook and Applause lists none of the specific types of data collected or the extent of Facebook’s access. Under “Risks/Benefits”, the form states “There are no known risks associated with this project however you acknowledge that the inherent nature of the project involves the tracking of personal information via your child’s use of Apps. You will be compensated by Applause for your child’s participation.” It gives parents no information about what data their kids are giving up.

Facebook claims it uses third-parties to target a diverse pool of participants. Yet Facebook conducts other user feedback and research programs on its own without the need for intermediaries that obscure its identity, and only ran the program in two countries. It claims to use a generic signup page to avoid biasing who will choose to participate, yet the cash incentive and technical process of installing the root certificate also bias who will participate, and the intermediaries conveniently prevent Facebook from being publicly associated with the program at first glance. Meanwhile, other clients of the Betabound testing platform like Amazon, Norton, and SanDisk reveal their names immediately before users sign up.

Facebook’s ads recruiting teens for the program didn’t disclose its involvement

Did we intentionally hide our identity as Facebook?

No — The Facebook brand is very prominent throughout the download and installation process, before any data is collected. Also, the app name on the device appears as “Facebook Research” — see attached screenshots. We use third parties to source participants in the research study, to avoid bias in the people who choose to participate. But as soon as they register, they become aware this is research for Facebook.

Author’s response: Facebook here admits that users did not know Facebook was involved before they registered.

What data do we collect? Do we read people’s private messages?

No, we don’t read private messages. We collect data to understand how people use apps, but this market research was not designed to look at what they share or see. We’re interested in information such as watch time, video duration, and message length, not the actual content of videos, messages, stories or photos. The app specifically ignores information shared via financial or health apps.

Author’s response: We never reported that Facebook was reading people’s private messages, but that it had the ability to collect them. Facebook here admits that the program was “not designed to look at what they share or see,” but stops far short of saying that data wasn’t collected. Fascinatingly, Facebook reveals that it was closely monitoring how much time people spent on different media types.

Facebook Research abused the Enterprise Certificate system meant for employee-only apps

Did we break Apple’s terms of service?

Apple’s view is that we violated their terms by sideloading this app, and they decide the rules for their platform. We’ve worked with Apple to address any issues; as a result, our internal apps are back up and running. Our relationship with Apple is really important — many of us use Apple products at work every day, and we rely on iOS for many of our employee apps, so we wouldn’t put that relationship at any risk intentionally. Mark and others will be available to talk about this further at Q&A later today.

Author’s response: TechCrunch reported that Apple’s policy plainly states that the Enterprise Certificate program requires companies to “Distribute Provisioning Profiles only to Your Employees and only in conjunction with Your Internal Use Applications for the purpose of developing and testing” and that “You may not use, distribute or otherwise make Your Internal Use Applications available to Your Customers”. Apple took a firm stance in its statement that Facebook did violate the program’s policies, stating “Facebook has been using their membership to distribute a data-collecting app to consumers, which is a clear breach of their agreement with Apple.”

Given Facebook distributed the Research apps to teenagers who never signed tax forms or formal employment agreements, they were obviously not employees or contractors, and most likely use some Facebook-owned service that qualifies them as customers. Also, I’m pretty sure you can’t pay employees in gift cards.



from Apple – TechCrunch https://tcrn.ch/2S2wIEU

Apple reactivates Facebook’s employee apps after punishment for Research spying

After TechCrunch caught Facebook violating Apple’s employee-only app distribution policy to pay people for all their phone data, Apple invalidated the social network’s Enterprise Certificate as punishment. That deactivated not only this Facebook Research app VPN, but also all of Facebook’s internal iOS apps for workplace collaboration, beta testing and even getting the company lunch or bus schedule. That threw Facebook’s offices into chaos yesterday morning. Now after nearly two work days, Apple has ended Facebook’s time-out and restored its Enterprise Certification. That means employees can once again access all their office tools, pre-launch test versions of Facebook and Instagram… and the lunch menu.

A Facebook spokesperson issued this statement to TechCrunch: “We have had our Enterprise Certification, which enables our internal employee applications, restored. We are in the process of getting our internal apps up and running. To be clear, this didn’t have an impact on our consumer-facing services.”

Meanwhile, TechCrunch’s follow-up report found that Google was also violating the Enterprise Certificate program with its own “market research” VPN app called Screenwise Meter that paid people to snoop on their phone activity. After we informed Google and Apple yesterday, Google quickly apologized and took down the app. But apparently in service of consistency, this morning Apple invalidated Google’s Enterprise Certificate too, breaking its employee-only iOS apps.

Google’s internal apps are still broken. Unlike Facebook, which has tons of employees on iOS, Google at least employs plenty of users of its own Android platform, so the disruption may have caused fewer problems in Mountain View than Menlo Park. “We’re working with Apple to fix a temporary disruption to some of our corporate iOS apps, which we expect will be resolved soon,” said a Google spokesperson. A spokesperson for Apple said: “We are working together with Google to help them reinstate their enterprise certificates very quickly.”

TechCrunch’s investigation found that the Facebook Research app not only installed an Enterprise Certificate on users’ phones and a VPN that could collect their data, but also demanded root network access that allows Facebook to man-in-the-middle their traffic and even decrypt secure transmissions. It paid users age 13 to 35 $10 to $20 per month to run the app so it could collect competitive intelligence on who to buy or copy. The Facebook Research app contained numerous code references to Onavo Protect, the app Apple banned and pushed Facebook to remove last August, yet Facebook kept the Research data collection program running.

When we first contacted Facebook, it claimed the Research app and its Enterprise Certificate distribution that sidestepped Apple’s oversight was in line with Apple’s policy. Seven hours later, Facebook announced it would shut down the Research app on iOS (though it’s still running on Android, which has fewer rules). Facebook also claimed that “there was nothing ‘secret’ about this,” challenging the characterization of our reporting. However, TechCrunch has since reviewed communications proving that the Facebook Research program threatened legal action if its users spoke publicly about the app. That sounds pretty “secret” to us.

Then we learned yesterday morning that Facebook hadn’t voluntarily pulled the app, as Apple had actually already invalidated Facebook’s Enterprise Certificate, thereby breaking the Research app and the social network’s employee tools. Apple provided this brutal statement, which it in turn applied to Google today:

We designed our Enterprise Developer Program solely for the internal distribution of apps within an organization. Facebook has been using their membership to distribute a data-collecting app to consumers, which is a clear breach of their agreement with Apple. Any developer using their enterprise certificates to distribute apps to consumers will have their certificates revoked, which is what we did in this case to protect our users and their data.

Apple is being likened to a vigilante privacy regulator overseeing Facebook and Google by The Verge’s Casey Newton and The New York Times’ Kevin Roose, perhaps with too much power, given they’re all competitors. But in this case, both Facebook and Google blatantly violated Apple’s policies to collect the maximum amount of data about iOS users, including teenagers. That means Apple was fully within its right to shut down their market research apps. Breaking their employee apps too could be seen as just collateral damage since they all use the same Enterprise Certification, or as additional punishment for violating the rules. This only becomes a real problem if Apple steps beyond the boundaries of its policies. But now, all eyes are on how it enforces its rules, whether to benefit its users or beat up on its rivals.



from Apple – TechCrunch https://tcrn.ch/2DMjaoh

Apple has blocked Google from running internal iOS apps after certificate misuse

Apple has blocked Google from distributing its internal-only iOS apps on its corporate network after a TechCrunch investigation found the search giant abusing the certificates.

“We’re working with Apple to fix a temporary disruption to some of our corporate iOS apps, which we expect will be resolved soon,” said a Google spokesperson. A spokesperson for Apple said: “We are working together with Google to help them reinstate their enterprise certificates very quickly.”

TechCrunch reported Wednesday that Google was using an Apple-issued certificate that allows the company to create and build internal apps for its staff for one of its consumer-facing apps, called Screenwise Meter, in violation of Apple’s rules. The app was designed to collect an extensive amount of data from a person’s iPhone for research, and using the special certificate allowed the company to let users bypass Apple’s App Store. Google later apologized, and said that the app “should not have operated under Apple’s developer enterprise program — this was a mistake.”

It followed in the footsteps of Facebook which, as we first reported earlier this week, was also abusing its internal-only certificates for a research app that paid teenagers to let the company vacuum up their phones’ web activity.

It’s not immediately clear how damaging this will be for Google. Not only does the ban break the Screenwise Meter app on iPhones; it also breaks every other app the search giant signed with the same certificate.

According to The Verge, many of Google’s internal apps have also stopped working. Those include early and pre-release versions of consumer-facing apps like Google Maps, Hangouts and Gmail, as well as employee-only apps such as its internal transportation apps.

Facebook faced a similar rebuke after Apple stepped in. We reported that after Apple’s ban was handed down, many of Facebook’s pre-launch, test-only versions of Facebook and Instagram stopped working, as well as other employee-only apps for coordinating office collaboration, travel, and seeing the company’s daily lunch schedule. Neither block affects apps that consumers download from Apple’s App Store.

Facebook has over 35,000 employees. Google has more than 94,000 employees.

It’s not known when — or if — Apple will issue Google or Facebook with new internal-only certificates, but they will almost certainly have newer, stricter rules attached.



from iPhone – TechCrunch https://tcrn.ch/2Wwo6VE
