Friday, 10 April 2020

Apple and Google are launching a joint COVID-19 tracing tool for iOS and Android

Apple and Google’s engineering teams have banded together to create a decentralized contact tracing tool that will help individuals determine whether they have been exposed to someone with COVID-19.

Contact tracing is a useful tool that helps public health authorities track the spread of the disease and inform the potentially exposed so that they can get tested. It does this by identifying and “following up with” people who have come into contact with a COVID-19-affected person.

The first phase of the project is an API that public health agencies can integrate into their own apps. The next phase is system-level contact tracing that will work across iOS and Android devices on an opt-in basis.

The system uses on-board radios on your device to transmit an anonymous ID over short ranges — using Bluetooth beaconing. Servers relay your last 14 days of rotating IDs to other devices, which search for a match. A match is determined based on a threshold of time spent and distance maintained between two devices.
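The rotating anonymous ID can be sketched roughly as follows. This is a hypothetical illustration in Python, not the published specification: the key names, the derivation scheme (an HMAC over the interval number) and the 16-byte payload size are all assumptions for the sake of the example.

```python
import hashlib
import hmac
import time

def rotating_id(daily_key: bytes, interval: int) -> bytes:
    """Derive a compact, unlinkable identifier for one 15-minute interval.

    The beacon reveals nothing about the device unless its owner later
    chooses to publish the key it was derived from.
    """
    mac = hmac.new(daily_key, interval.to_bytes(4, "big"), hashlib.sha256)
    return mac.digest()[:16]  # truncated to fit a small Bluetooth payload

# Each device would derive a fresh key per day and a fresh ID every 15 minutes.
daily_key = hashlib.sha256(b"device-secret|2020-04-10").digest()
interval = int(time.time()) // (15 * 60)  # current 15-minute slot
beacon_payload = rotating_id(daily_key, interval)
```

Because the identifier changes every interval, an observer cannot link successive beacons back to the same device without the key.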

If a match is found with another user that has told the system that they have tested positive, you are notified and can take steps to be tested and to self-quarantine.

Contact tracing is a well-known and debated tool, but one that has been adopted by health authorities and universities that are working on multiple projects like this. One such example is MIT’s efforts to use Bluetooth to create a privacy-conscious contact tracing tool that was inspired by Apple’s Find My system. The companies say that those organizations identified technical hurdles that they were unable to overcome and asked for help.

The project was started two weeks ago by engineers from both companies. One of the reasons the companies got involved is that there is poor interoperability between systems on various manufacturers’ devices. With contact tracing, every time you fragment a system like this between multiple apps, you limit its effectiveness greatly. You need a massive amount of adoption in one system for contact tracing to work well.

At the same time, you run into technical problems like Bluetooth power suck, privacy concerns about centralized data collection and the sheer effort it takes to get enough people to install the apps to be effective.

Two-phase plan

To fix these issues, Google and Apple teamed up to create an interoperable API that should allow the largest number of users to adopt it, if they choose.

The first phase, a private proximity contact detection API, will be released in mid-May by both Apple and Google for use in apps on iOS and Android. In a briefing today, Apple and Google said that the API is a simple one and should be relatively easy for existing or planned apps to integrate. The API would allow apps to ask users to opt-in to contact tracing (the entire system is opt-in only), allowing their device to broadcast the anonymous, rotating identifier to devices that the person “meets.” This would allow tracing to be done to alert those who may come in contact with COVID-19 to take further steps.

The value of contact tracing should extend beyond the initial period of the pandemic and into the time when self-isolation and quarantine restrictions are eased.

The second phase of the project is to bring even more efficiency and adoption to the tracing tool by bringing it to the operating system level. There would be no need to download an app; users would just opt in to the tracing right on their device. The public health apps would continue to be supported, but this would reach a much larger spread of users.

This phase, which is slated for the coming months, would give the contact tracing tool the ability to work at a deeper level, improving battery life, effectiveness and privacy. If it’s handled by the system, then every improvement in those areas — including cryptographic advances — would benefit the tool directly.

How it works

A quick example of how a system like this might work:

  1. Two people happen to be near each other for a period of time, let’s say 10 minutes. Their phones exchange the anonymous identifiers (which change every 15 minutes).
  2. Later on, one of those people is diagnosed with COVID-19 and enters it into the system via a Public Health Authority app that has integrated the API.
  3. With additional consent, the diagnosed user allows their anonymous identifiers for the last 14 days to be transmitted to the system.
  4. The person they came into contact with has a Public Health app on their phone that downloads the broadcast keys of positive tests and alerts them to a match.
  5. The app gives them more information on how to proceed from there.
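The matching in steps 3 and 4 above happens entirely on the device. A minimal sketch, assuming the app records how long each observed identifier was nearby (the function name and the 10-minute threshold are illustrative, not taken from the actual API):

```python
def find_exposures(observed, positive_ids, min_minutes=10):
    """Check locally stored contact observations against identifiers
    published by users who reported a positive diagnosis.

    observed: dict mapping identifier -> minutes of proximity recorded
    positive_ids: set of identifiers uploaded with the diagnosed user's consent
    Returns the identifiers that exceeded the exposure time threshold.
    """
    return {
        ident for ident, minutes in observed.items()
        if ident in positive_ids and minutes >= min_minutes
    }

# Example: one 12-minute contact was later reported positive; a brief
# 3-minute contact stays below the threshold and triggers no alert.
observed = {"id-aaa": 12, "id-bbb": 3, "id-ccc": 25}
positives = {"id-aaa", "id-bbb"}
print(find_exposures(observed, positives))  # prints {'id-aaa'}
```

Only the match result ever surfaces; the list of observed identifiers never leaves the phone.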

Privacy and transparency

Both Apple and Google say that privacy and transparency are paramount in a public health effort like this one and say they are committed to shipping a system that does not compromise personal privacy in any way. This is a factor that has been raised by the ACLU, which has cautioned that any use of cell phone tracking to track the spread of COVID-19 would need aggressive privacy controls.

No location data is used at any point, including for users who report positive. This tool is not about where affected people are but instead whether they have been around other people.

The system works by assigning a random, rotating identifier to a person’s phone and transmitting it via Bluetooth to nearby devices. That identifier, which rotates every 15 minutes and contains no personally identifiable information, will pass through a simple relay server that can be run by health organizations worldwide.
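Such a relay server needs to do very little: accept batches of identifiers uploaded with consent and serve the accumulated list for devices to scan against. A minimal in-memory sketch; the class, methods and retention policy are assumptions for illustration, not the companies' design:

```python
import time

RETENTION_SECONDS = 14 * 24 * 3600  # identifiers expire after 14 days

class RelayServer:
    """Toy relay: stores identifiers from positive reports and serves
    the current list so devices can do their own on-device matching."""

    def __init__(self):
        self._store = []  # list of (upload_timestamp, identifier)

    def upload(self, identifiers):
        now = time.time()
        self._store.extend((now, ident) for ident in identifiers)

    def download(self):
        # prune anything older than the 14-day contagion window
        cutoff = time.time() - RETENTION_SECONDS
        self._store = [(t, i) for t, i in self._store if t >= cutoff]
        return [ident for _, ident in self._store]
```

Note that the server never learns who received which beacons; it only redistributes opaque identifiers.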

Even then, the list of identifiers you’ve been in contact with doesn’t leave your phone unless you choose to share it. Users that test positive will not be identified to other users, Apple or Google. Google and Apple can disable the broadcast system entirely when it is no longer needed.

All identification of matches is done on your device, allowing you to see — within a 14-day window — whether your device has been near the device of a person who has self-identified as having tested positive for COVID-19.

The entire system is opt-in. Users will know upfront that they are participating, whether in app or at a system level. Public health authorities are involved in notifying users that they have been in contact with an affected person. Apple and Google say that they will openly publish information about the work that they have done for others to analyze in order to bring the most transparency possible to the privacy and security aspects of the project.

“All of us at Apple and Google believe there has never been a more important moment to work together to solve one of the world’s most pressing problems,” the companies said in a statement. “Through close cooperation and collaboration with developers, governments and public health providers, we hope to harness the power of technology to help countries around the world slow the spread of COVID-19 and accelerate the return of everyday life.”

You can find more information about the contact tracing API in Google’s post here and on Apple’s page here, including specifications.



from Android – TechCrunch https://ift.tt/2XpW2Gh
via IFTTT


Thursday, 9 April 2020

Report: Apple’s iOS 14 contains code that would let you sample apps before download

Apple is developing a feature that would allow iOS users to interact with a third-party app even if the app isn’t installed on their device, according to a report from 9to5Mac. The report is based on information discovered in the iOS 14 code, which is not necessarily an indication of launch plans on Apple’s part — but rather an insight into some of Apple’s work in progress.

The feature is referenced internally as the “Clips” API — not to be confused with Apple’s video editing app of the same name. Based on 9to5Mac’s analysis, the new API works in conjunction with the QR Code reader, allowing a user to scan a code linked to an app, then interact with that app from a card that appears on their screen.

Described like this, the feature sounds like a marketing tool for app publishers, as it would offer a way for users to try out new apps before they download them to get a better feel for the experience than a banner ad would allow. In addition to offering some interactivity with an app before it’s downloaded, the card could also be used to redirect users to the App Store if they choose to download the full version. The card could also be used to open the app directly to the content, in the case of apps the user already had installed.

Google’s Android, the report noted, offers a similar feature called “Slices,” launched in 2018. While Google had already introduced a way to interact with small pieces of an app in an experience called Instant Apps, the newer Slices feature was meant to drive usage of apps — like booking a ride or hotel room, for example, without having to first locate the app and launch it. On iOS, perhaps, these app “clips” could be pulled up by Siri or in Spotlight search — but that functionality wasn’t demonstrated by the code the report referenced today.

It’s unclear what Apple’s intentions are with the Clips API or how experimental its efforts are at this time.

However, the report found the feature was being tested with OpenTable, Yelp, DoorDash, Sony (the PS4 Second Screen app) and YouTube. This could indicate a plan to demo examples of the app’s functionality in a future reveal to developers.



from Android – TechCrunch https://ift.tt/2Xofi6U
via IFTTT

Double emerges from stealth with $6M to pair CEOs with remote assistants

CEOs often rely on executive assistants to handle the less glamorous logistics of their day so they can focus on managing a company, but hiring a full-time assistant isn’t always easy to justify, especially at a budding startup.

Double is aiming to cater to busy C-suite execs who probably don’t need a full-time assistant but could still use some help managing their email, arranging travel, scheduling meetings and balancing their endless work with a personal life. They’re pitching a service to startup CEOs and investors that matches them up with contracted remote assistants to help free up their schedules.

“At the end of the day, these people are spending hours a day doing the things they aren’t best at,” CEO Alice Default told TechCrunch in an interview.

Double’s contracted assistants are all based in the U.S. and have years of previous experience as EAs, Double says. When an exec signs up for the service, they are guided through an onboarding call where they can share some of their needs before being paired up with a dedicated assistant. Double says its assistants are generally working with about 4-5 clients at a time and in some cases are assisting multiple execs at the same company.

The New York startup has been building their product under wraps and has raised some $6 million in funding from VCs including Index Ventures and Paris-based Daphni. The team previously helped build the popular Sunrise calendar app, which Microsoft bought in 2015 only to later discontinue.

One of Double’s big initiatives is honing the effectiveness of combining human efforts and software automation. The team hasn’t pushed too heavily on the latter, but Default says that they see plenty of room to augment how assistants handle tasks by letting automation get the ball rolling.

“We are thinking about automation quite a bit, for us this relationship with [human labor] can be much better,” Default says.

Double has spent the last couple years developing software to facilitate the connection between assistants and executives. The team now offers desktop and mobile apps as well as a Chrome extension that can allow execs to push updates to their assistants with ease. At this point, the service is iOS-only and requires a G Suite account so no dice at the moment for Outlook or Android users.

“What we realized pretty early on is that one of the things that’s hard about delegating is giving the proper context,” Default says.

The service charges hourly rates, with a minimum of $250 per month for 5 hours of assistant work. Default says early CEOs onboarded to the service in beta pay on average about $800 per month for a bit less than an hour of assistance per day.

Launching a premium service for executives in the midst of a pandemic crisis where a good deal of startups are thinking about layoffs is far from perfect launch timing for Double, but Default believes the service can provide a lot of value to busy executives scrambling to adapt their businesses. Default says the service has already seen some early users pause their subscriptions but notes that the month-to-month structure is flexible by design and makes it easy for users to pick things back up when their firms (hopefully) emerge from crisis mode.



from Android – TechCrunch https://ift.tt/34pMpJc
via IFTTT


Android gets a built-in Braille keyboard

Android has received a wealth of accessibility features over the last couple of years, but one that has been left to third-party developers is a way for blind users to type using braille. That changes today with Android’s new built-in braille keyboard, which should soon be available as an option on all phones running version 5 or later of the OS.

Braille is a complex topic in the accessibility community, as in many ways it has been supplanted by voice recognition, screen readers, and other tools. But many people are already familiar with it and use it regularly — and after all, one can’t always chat out loud.

Third-party braille keyboards are available, but some cost money or are no longer in development. And since the keyboard essentially has access to everything you type, there are security considerations as well. So it’s best for the keyboard you use to be an official one from a reputable company. Google will have to do!

The new keyboard, the company writes in a blog post, was created as a collaboration with various users and developers of braille software, and should be familiar to anyone who’s used something like it in the past.

The user holds the phone in landscape mode, with the screen facing away from them, and taps the regions corresponding to each of the six dots that form letters in the braille alphabet. It works with Android’s TalkBack function, which reads off words the user types or selects, so like any other writing method errors can be quickly detected and corrected. There are also some built-in gestures for quickly deleting letters and words or sending the text to the recipient or selected field.
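Conceptually, each tapped combination of the six regions selects one braille cell. A small sketch of that mapping for a few letters of standard English braille (the code is purely illustrative and is not Google's implementation):

```python
# Dots 1-3 form the left column of a braille cell, dots 4-6 the right.
# Each letter corresponds to a set of raised dots; the mapping below
# covers the first five letters of standard English braille.
BRAILLE = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
}

def decode(taps):
    """Translate a sequence of tapped dot sets into text.

    Unknown combinations decode to '?' in this simplified sketch.
    """
    return "".join(BRAILLE.get(frozenset(t), "?") for t in taps)

print(decode([{1, 2}, {1}, {1, 4}]))  # prints "bac"
```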

Instructions for activating the braille keyboard are here. Right now it’s only available in English, but more languages will likely be added in the near future.



from Android – TechCrunch https://ift.tt/39VCNHj
via IFTTT

MIT develops privacy-preserving COVID-19 contact tracing inspired by Apple’s ‘Find My’ feature

One of the efforts that’s been proposed to contain the spread of COVID-19 is a contact trace and track program that would allow health officials to keep better tabs on individuals who have been infected, and alert them to potential spread. Contact tracing has already seemingly proven effective in some parts of the world that have managed to curb the coronavirus spread, but privacy advocates have big reservations about any such system’s implementation in the U.S.

There are a number of proposals for how to implement a contact tracing system that preserves privacy, including a decentralized proposal from a group of European experts. In the U.S., MIT researchers have devised a new method that would provide automated contact tracing by tapping into the Bluetooth signals sent out by everyone’s mobile devices, tying contacts to random numbers that aren’t linked to an individual’s identity in any way.

The system works by having each mobile device constantly send out random strings of numbers that the researchers liken to “chirps” (though not actually audible). These are sent via Bluetooth, which is key for a couple of reasons, including that most people have Bluetooth enabled on their device all the time, and that it’s a short-range radio communication protocol that ensures any reception of a “chirp” came from someone you were in relatively close contact with.

If any person tests positive for COVID-19, they can then upload a full list of the chirps that their phone has broadcast over the past 14 days (which, at the outside, should represent the full time they’ve been contagious). Those go into a database of chirps associated with confirmed positive cases, which others can scan against to see if their phone has received one of those chirps during that time. A positive match with one of those indicates that an individual could be at risk, since they were at least within 40 feet or so of a person who has the virus, and it’s a good indicator that they should seek a test if available, or at least self-quarantine for the recommended two-week period.
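The 14-day chirp log and the database check described above can be sketched like this. It's a simplified illustration, not MIT's actual code; the class, method names and pruning logic are assumptions:

```python
from datetime import datetime, timedelta

class ChirpLog:
    """Keep a rolling 14-day log of received chirps and check it
    against a published database of positive-case chirps."""

    WINDOW = timedelta(days=14)

    def __init__(self):
        self.received = {}  # chirp -> time it was first heard

    def hear(self, chirp, when):
        # record only the first time a given chirp was observed
        self.received.setdefault(chirp, when)

    def check(self, positive_chirps, now):
        # drop entries older than the contagion window, then look
        # for any overlap with the published positive chirps
        cutoff = now - self.WINDOW
        self.received = {c: t for c, t in self.received.items() if t >= cutoff}
        return set(self.received) & set(positive_chirps)
```

As in the Apple/Google design, the comparison happens locally: only the database of positive chirps moves over the network, never the log of what a phone has heard.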

MIT’s system sidesteps entirely many of the thorniest privacy-related issues around contact tracing, which have been discussed in detail by the ACLU and other privacy protection organizations: It doesn’t use any geolocation information at all, nor does it connect any diagnosis or other information to a particular individual. Nor is the upload left entirely to individual discretion, which would be a risk from the perspective of ensuring compliance: MIT envisions a health official providing a QR code along with any positive diagnosis that would trigger the upload of a person’s chirp history to the database.

The system would work through an app that users install on their phone, and its design was inspired by Apple’s “Find My” system for locating lost Mac and iOS hardware, as well as keeping track of the location of devices owned by loved ones. Find My also uses chirps to broadcast locations to passing Apple hardware.

“Find My inspired this system,” says Marc Zissman, the associate head of MIT Lincoln Laboratory’s Cyber Security and Information Science Division and co-principal investigator of the project, in a blog post describing the research. “If my phone is lost, it can start broadcasting a Bluetooth signal that’s just a random number; it’s like being in the middle of the ocean and waving a light. If someone walks by with Bluetooth enabled, their phone doesn’t know anything about me; it will just tell Apple, ‘Hey, I saw this light.’”

The system could be adapted to automate check-ins against the positive chirp database, and provide alerts to individuals who should get tested or self-isolate. Researchers worked closely with public health officials to ensure that this will suit their needs and goals as well as preserving privacy.

MIT’s team says that a critical next step to making this actually work broadly is to get Apple, Google and Microsoft on board with the plan. This requires close collaboration with mobile device platform operators to work effectively, they note. Extrapolating a step further, were iOS and Android to offer these as built-in features, that would go a long way towards encouraging widespread adoption.



from Apple – TechCrunch https://ift.tt/2Xyg1Tt