Tuesday, 26 May 2020

India’s contact tracing app is going open source

India said it will publicly release the source code of its contact tracing app, Aarogya Setu, to the relief of privacy and security experts who have been advocating for this ever since the app launched in early April.

Ajay Prakash Sawhney, secretary in the ministry of electronics and information technology, made the announcement on Tuesday, describing it as “opening the heart” of Aarogya Setu’s Android app, which has amassed over 115 million users in fewer than 60 days, to allow engineers to inspect and tinker with the code. The source code will be published on GitHub at midnight Tuesday (local time).

Sawhney said the government will also offer a cash prize of up to $1,325 for identifying and reporting bugs and vulnerabilities in the code of Aarogya Setu.

More to follow…

 



from Android – TechCrunch https://ift.tt/2XyIKpd
via IFTTT

A new Android bug, Strandhogg 2.0, lets malware pose as real apps and steal user data

Security researchers have found a major vulnerability in almost every version of Android, which lets malware imitate legitimate apps to steal app passwords and other sensitive data.

The vulnerability, dubbed Strandhogg 2.0 (named after the Norse term for a hostile takeover), affects all devices running Android 9.0 and earlier. It’s the “evil twin” to an earlier bug of the same name, according to Norwegian security firm Promon, which discovered both vulnerabilities six months apart. Strandhogg 2.0 works by tricking a victim into thinking they’re entering their password on a legitimate app while they are instead interacting with a malicious overlay. Strandhogg 2.0 can also hijack other app permissions to siphon off sensitive user data, like contacts and photos, and to track a victim’s real-time location.

The bug is said to be more dangerous than its predecessor because it’s “nearly undetectable,” Tom Lysemose Hansen, founder and chief technology officer at Promon, told TechCrunch.

The good news is that Promon said it has no evidence that hackers have used the bug in active hacking campaigns. The caveat is that there are “no good ways” to detect an attack. Fearing the bug could still be abused by hackers, Promon delayed releasing details of the bug until Google could fix the “critical”-rated vulnerability.

A spokesperson for Google told TechCrunch that the company also saw no evidence of active exploitation. “We appreciate the work of the researchers, and have released a fix for the issue they identified.” The spokesperson said Google Play Protect, an app screening service built into Android devices, blocks apps that exploit the Strandhogg 2.0 vulnerability.

Strandhogg 2.0 works by abusing Android’s multitasking system, which keeps tabs on every recently opened app so that the user can quickly switch back and forth. A victim would first have to download a malicious app — disguised as a normal app — that can exploit the Strandhogg 2.0 vulnerability. Once it is installed, whenever the victim opens a legitimate app, the malicious app quickly hijacks that app and injects malicious content in its place, such as a fake login window.
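To make the task-hijacking idea concrete, here is a small, purely illustrative Kotlin sketch. It is plain Kotlin, not real Android code or the actual exploit, and the package names are made up; it only models why the user ends up looking at the attacker’s screen when they think they opened the real app.

```kotlin
// Conceptual model of a Strandhogg-style task hijack. No Android APIs are
// used; this only illustrates the back-stack idea described above.
data class Screen(val packageName: String, val label: String)

class Task {
    private val backStack = ArrayDeque<Screen>()
    fun push(screen: Screen) = backStack.addLast(screen)
    fun top(): Screen? = backStack.lastOrNull()
}

fun main() {
    // The legitimate app owns its own task and back stack.
    val bankTask = Task()
    bankTask.push(Screen("com.example.bank", "Bank home"))

    // The vulnerability lets malware place its phishing screen on top of
    // the victim app's task instead of its own.
    bankTask.push(Screen("com.evil.malware", "Fake bank login"))

    // When the user taps the bank's icon, the system resumes whatever is
    // on top of the bank's task, which is now the attacker's overlay.
    val visible = bankTask.top()
    println("User sees: ${visible?.label} (actually ${visible?.packageName})")
}
```

Because the overlay can copy the real app’s look pixel for pixel, there is nothing in this picture for the user to notice, which is why Promon describes the attack as nearly undetectable.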

When a victim enters their password on the fake overlay, it is siphoned off to the hacker’s servers. The real app then loads as though the login had succeeded.

Strandhogg 2.0 doesn’t need any Android permissions to run, but it can also hijack the permissions of other apps that have access to a victim’s contacts, photos, and messages by triggering a permissions request.

“If the permission is granted, then the malware now has this dangerous permission,” said Hansen.

Once that permission is granted, the malicious app can upload data from a user’s phone. The malware can upload entire text message conversations, said Hansen, allowing the hackers to defeat two-factor authentication protections.

The risk to users is likely low, but not zero. Promon said updating Android devices with the latest security updates — out now — will fix the vulnerability. Users are advised to update their Android devices as soon as possible.



from Android – TechCrunch https://ift.tt/2X3jWa8
via IFTTT

Saturday, 23 May 2020

Hackers release a new jailbreak that unlocks every iPhone

A renowned iPhone hacking team has released a new “jailbreak” tool that unlocks every iPhone, even the most recent models running the latest iOS 13.5.

For as long as Apple has kept up its “walled garden” approach to iPhones by only allowing apps and customizations that it approves, hackers have tried to break free from what they call the “jail,” hence the name “jailbreak.” Hackers do this by finding a previously undisclosed vulnerability in iOS that breaks through some of the many restrictions that Apple puts in place to prevent access to the underlying software. Apple says it does this for security. But jailbreakers say breaking through those restrictions allows them to customize their iPhones more than they would otherwise, in a way that most Android users are already accustomed to.

The jailbreak, released by the unc0ver team, supports all iPhones that run iOS 11 and above, including up to iOS 13.5, which Apple released this week.

Details of the vulnerability that the hackers used to build the jailbreak aren’t known, but it’s not expected to last forever. Just as jailbreakers work to find a way in, Apple works fast to patch the flaws and close the jailbreak.

Security experts typically advise iPhone users against jailbreaking, because breaking out of the “walled garden” vastly increases the surface area for new vulnerabilities to exist and to be found.

The jailbreak comes at a time when the shine is wearing off Apple’s typically strong security image. Last week, Zerodium, a broker for exploits, said it would no longer buy certain iPhone vulnerabilities because there were too many of them. Motherboard reported this week that hackers got their hands on a pre-release version of the upcoming iOS 14 release several months ago.



from iPhone – TechCrunch https://ift.tt/3bVIBkT


This Week in Apps: Facebook takes on Shopify, Tinder considers its future, contact-tracing tech goes live

Welcome back to This Week in Apps, the Extra Crunch series that recaps the latest OS news, the applications they support and the money that flows through it all.

The app industry is as hot as ever, with a record 204 billion downloads and $120 billion in consumer spending in 2019. People are now spending three hours and 40 minutes per day using apps, rivaling TV. Apps aren’t just a way to pass idle hours — they’re a big business. In 2019, mobile-first companies had a combined $544 billion valuation, 6.5x higher than those without a mobile focus.

In this Extra Crunch series, we help you keep up with the latest news from the world of apps, delivered on a weekly basis.

This week we’re continuing to look at how the coronavirus outbreak is impacting the world of mobile applications. Notably, we saw the launch of the Apple/Google exposure-notification API with the latest version of iOS out this week. The pandemic is also inspiring other new apps and features, including upcoming additions to Apple’s Schoolwork, which focus on distance learning, as well as Facebook’s new Shops feature designed to help small businesses shift their operations online in the wake of physical retail closures.

Tinder, meanwhile, seems to be toying with the idea of pivoting to a global friend finder and online hangout in the wake of social distancing, with its test of a feature that allows users to match with others worldwide — meaning, with no intention of in-person dating.

Headlines

COVID-19 apps in the news

  • Fitbit app: The fitness tracker app launched a COVID-19 early detection study aimed at determining whether wearables can help detect COVID-19 or the flu. The study will ask volunteers questions about their health, including whether they had COVID-19, then pair that with activity data to see if there are any clues that could be used to build an early warning algorithm of sorts.
  • U.K. contact-tracing app: The app won’t be ready in mid-May as promised, as the government mulls the use of the Apple/Google API. In testing, the existing app drains the phone battery too quickly. In addition, researchers have recently identified seven security flaws in the app, which is currently being trialed on the Isle of Wight.

Apple launches iOS/iPadOS 13.5 with Face ID tweak and contact-tracing API

Apple this week released the latest version of iOS/iPadOS with two new features related to the pandemic. The first is an update to Face ID, which can now tell when the user is wearing a mask. In those cases, Face ID will instead switch to the Passcode field so you can type in your code to unlock your phone, or authenticate with apps like the App Store, Apple Books, Apple Pay, iTunes and others.

The other new feature is the launch of the exposure-notification API jointly developed by Apple and Google. The API allows for the development of apps from public health organizations and governments that can help determine if someone has been exposed to COVID-19. The apps that support the API have yet to launch, but some 22 countries have requested API access.
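For readers curious how a decentralized exposure-notification scheme can work without sharing location or identity, here is a heavily simplified, runnable Kotlin sketch of the general idea. It is an illustration only: it does not use the real Apple/Google framework, and the derivation shown (a plain SHA-256 of a daily key plus a time interval) is a stand-in for the actual cryptography in the published specification.

```kotlin
import java.security.MessageDigest
import kotlin.random.Random

// Toy model: phones broadcast short-lived random identifiers derived from a
// daily key, remember identifiers they hear nearby over Bluetooth, and later
// match locally against daily keys that diagnosed users consent to publish.
fun rollingId(dailyKey: ByteArray, interval: Int): String {
    val digest = MessageDigest.getInstance("SHA-256")
    digest.update(dailyKey)
    digest.update(interval.toString().toByteArray())
    return digest.digest().take(8).joinToString("") { "%02x".format(it) }
}

fun main() {
    val aliceDailyKey = Random.nextBytes(16)

    // Bob's phone records the rolling IDs it hears during the day
    // (144 ten-minute intervals in this toy model).
    val heardByBob = (0 until 144).map { rollingId(aliceDailyKey, it) }.toSet()

    // Alice tests positive and consents to publishing her daily key.
    val publishedKeys = listOf(aliceDailyKey)

    // Bob's phone re-derives rolling IDs from the published keys and checks
    // for matches entirely on the device; no location data is involved.
    val exposed = publishedKeys.any { key ->
        (0 until 144).any { rollingId(key, it) in heardByBob }
    }
    println("Possible exposure detected: $exposed")
}
```

In the published design the broadcast identifiers rotate every few minutes and are derived with standard key-derivation functions rather than a bare hash, but the privacy property illustrated here is the same: matching happens on the device, against keys an infected user chose to publish.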



from Android – TechCrunch https://ift.tt/2ZvkdUU
via IFTTT

Friday, 22 May 2020

The FBI is mad because it keeps getting into locked iPhones without Apple’s help

The debate over encryption continues to drag on without end.

In recent months, the discourse has largely swung away from encrypted smartphones to focus instead on end-to-end encrypted messaging. But a recent press conference by the heads of the Department of Justice (DOJ) and the Federal Bureau of Investigation (FBI) showed that the debate over device encryption isn’t dead; it was merely resting. And it just won’t go away.

At the presser, Attorney General William Barr and FBI Director Chris Wray announced that after months of work, FBI technicians had succeeded in unlocking the two iPhones used by the Saudi military officer who carried out a terrorist shooting at the Pensacola Naval Air Station in Florida in December 2019. The shooter died in the attack, which was quickly claimed by Al Qaeda in the Arabian Peninsula.

Early this year — a solid month after the shooting — Barr had asked Apple to help unlock the phones (one of which was damaged by a bullet), which were older iPhone 5 and 7 models. Apple provided “gigabytes of information” to investigators, including “iCloud backups, account information and transactional data for multiple accounts,” but drew the line at assisting with the devices. The situation threatened to revive the 2016 “Apple versus FBI” showdown over another locked iPhone following the San Bernardino terror attack.

After the government went to federal court to try to dragoon Apple into doing investigators’ job for them, the dispute ended anticlimactically when the government got into the phone itself after purchasing an exploit from an outside vendor the government refused to identify. The Pensacola case culminated much the same way, except that the FBI apparently used an in-house solution instead of a third party’s exploit.

You’d think the FBI’s success at a tricky task (remember, one of the phones had been shot) would be good news for the Bureau. Yet an unmistakable note of bitterness tinged the laudatory remarks at the press conference for the technicians who made it happen. Despite the Bureau’s impressive achievement, and despite the gobs of data Apple had provided, Barr and Wray devoted much of their remarks to maligning Apple, with Wray going so far as to say the government “received effectively no help” from the company.

This diversion tactic worked: in news stories covering the press conference, headline after headline after headline highlighted the FBI’s slam against Apple instead of focusing on what the press conference was nominally about: the fact that federal law enforcement agencies can get into locked iPhones without Apple’s assistance.

That should be the headline news, because it’s important. That inconvenient truth undercuts the agencies’ longstanding claim that they’re helpless in the face of Apple’s encryption and thus the company should be legally forced to weaken its device encryption for law enforcement access. No wonder Wray and Barr are so mad that their employees keep being good at their jobs.

By reviving the old blame-Apple routine, the two officials managed to evade a number of questions that their press conference left unanswered. What exactly are the FBI’s capabilities when it comes to accessing locked, encrypted smartphones? Wray claimed the technique developed by FBI technicians is “of pretty limited application” beyond the Pensacola iPhones. How limited? What other phone-cracking techniques does the FBI have, and which handset models and which mobile OS versions do those techniques reliably work on? In what kinds of cases, for what kinds of crimes, are these tools being used?

We also don’t know what’s changed internally at the Bureau since that damning 2018 Inspector General postmortem on the San Bernardino affair. Whatever happened with the FBI’s plans, announced in the IG report, to lower the barrier within the agency to using national security tools and techniques in criminal cases? Did that change come to pass, and did it play a role in the Pensacola success? Is the FBI cracking into criminal suspects’ phones using classified techniques from the national security context that might not pass muster in a court proceeding (were their use to be acknowledged at all)?

Further, how do the FBI’s in-house capabilities complement the larger ecosystem of tools and techniques for law enforcement to access locked phones? Those include third-party vendors GrayShift and Cellebrite’s devices, which, in addition to the FBI, count numerous U.S. state and local police departments and federal immigration authorities among their clients. When plugged into a locked phone, these devices can bypass the phone’s encryption to yield up its contents, and (in the case of GrayShift) can plant spyware on an iPhone to log its passcode when police trick a phone’s owner into entering it. These devices work on very recent iPhone models: Cellebrite claims it can unlock any iPhone for law enforcement, and the FBI has unlocked an iPhone 11 Pro Max using GrayShift’s GrayKey device.

In addition to Cellebrite and GrayShift, which have a well-established U.S. customer base, the ecosystem of third-party phone-hacking companies includes entities that market remote-access phone-hacking software to governments around the world. Perhaps the most notorious example is the Israel-based NSO Group, whose Pegasus software has been used by foreign governments against dissidents, journalists, lawyers and human rights activists. The company’s U.S. arm has attempted to market Pegasus domestically to American police departments under another name. Which third-party vendors are supplying phone-hacking solutions to the FBI, and at what price?

Finally, who else besides the FBI will be the beneficiary of the technique that worked on the Pensacola phones? Does the FBI share the vendor tools it purchases, or its own home-rolled ones, with other agencies (federal, state, tribal or local)? Which tools, which agencies and for what kinds of cases? Even if it doesn’t share the techniques directly, will it use them to unlock phones for other agencies, as it did for a state prosecutor soon after purchasing the exploit for the San Bernardino iPhone?

We have little idea of the answers to any of these questions, because the FBI’s capabilities are a closely held secret. What advances and breakthroughs it has achieved, and which vendors it has paid, we (who provide the taxpayer dollars to fund this work) aren’t allowed to know. And the agency refuses to answer questions about encryption’s impact on its investigations even from members of Congress, who can be privy to confidential information denied to the general public.

The only public information coming out of the FBI’s phone-hacking black box is nothingburgers like the recent press conference. At an event all about the FBI’s phone-hacking capabilities, Director Wray and AG Barr cunningly managed to deflect the press’s attention onto Apple, dodging any difficult questions, such as what the FBI’s abilities mean for Americans’ privacy, civil liberties and data security, or even basic questions like how much the Pensacola phone-cracking operation cost.

As the recent PR spectacle demonstrated, a press conference isn’t oversight. And instead of exerting its oversight power, mandating more transparency, or requiring an accounting and cost/benefit analysis of the FBI’s phone-hacking expenditures — instead of demanding a straight and conclusive answer to the eternal question of whether, in light of the agency’s continually-evolving capabilities, there’s really any need to force smartphone makers to weaken their device encryption — Congress is instead coming up with dangerous legislation such as the EARN IT Act, which risks undermining encryption right when a population forced by COVID-19 to do everything online from home can least afford it.

The best-case scenario now is that the federal agency that proved its untrustworthiness by lying to the Foreign Intelligence Surveillance Court can crack into our smartphones, but maybe not all of them; that maybe it isn’t sharing its toys with state and local police departments (which are rife with domestic abusers who’d love to get access to their victims’ phones); that unlike third-party vendor devices, maybe the FBI’s tools won’t end up on eBay where criminals can buy them; and that hopefully it hasn’t paid taxpayer money to the spyware company whose best-known government customer murdered and dismembered a journalist.

The worst-case scenario would be that, between in-house and third-party tools, pretty much any law enforcement agency can now reliably crack into everybody’s phones, and yet this turns out to be the year they finally get their legislative victory over encryption anyway. I can’t wait to see what else 2020 has in store.



from iPhone – TechCrunch https://ift.tt/2ZstkFH
