Wednesday, 15 April 2020

WorldGaze uses smartphone cameras to help voice AIs cut to the chase

If you find voice assistants frustratingly dumb, you’re hardly alone. The much-hyped promise of AI-driven vocal convenience very quickly falls through the cracks of robotic pedantry.

A smart AI that has to come back again (and sometimes again) to ask for extra input to execute your request can seem especially dumb — when, for example, it doesn’t get that the most likely repair shop you’re asking about is not just any repair shop but the one you’re parked outside of right now.

Researchers at the Human-Computer Interaction Institute at Carnegie Mellon University, working with Gierad Laput, a machine learning engineer at Apple, have devised a demo software add-on for voice assistants that lets smartphone users boost the savvy of an on-device AI by giving it a helping hand — or rather a helping head.

The prototype system makes simultaneous use of a smartphone’s front and rear cameras to locate the user’s head in physical space — and, more specifically, within the immediate surroundings, which are parsed using computer vision to identify objects in the vicinity.

The user is then able to use their head as a pointer to direct their gaze at whatever they’re talking about — i.e. ‘that garage’ — wordlessly filling in contextual gaps in the AI’s understanding in a way the researchers contend is more natural.

So, instead of needing to talk like a robot in order to tap the utility of a voice AI, you can sound a bit more, well, human. Asking stuff like ‘Siri, when does that Starbucks close?’ Or — in a retail setting — ‘are there other color options for that sofa?’ Or asking for an instant price comparison between ‘this chair and that one’. Or for a lamp to be added to your wish-list.

In a home/office scenario, the system could also let the user remotely control a variety of devices within their field of vision — without needing to be hyper specific about it. Instead they could just look towards the smart TV or thermostat and speak the required volume/temperature adjustment.

The team has put together a demo video (below) showing the prototype — which they’ve called WorldGaze — in action. “We use the iPhone’s front-facing camera to track the head in 3D, including its direction vector. Because the geometry of the front and back cameras are known, we can raycast the head vector into the world as seen by the rear-facing camera,” they explain in the video.

“This allows the user to intuitively define an object or region of interest using the head gaze. Voice assistants can then use this contextual information to make enquiries that are more precise and natural.”
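
The geometric core the researchers describe is straightforward to sketch. The Swift snippet below is a minimal illustration of the idea in the quote above: re-expressing a head-gaze ray measured by the front camera in the rear camera’s coordinate frame, then projecting it into the rear image where detected objects can be hit-tested. It is not the team’s implementation; the head pose, the front-to-rear transform and the pinhole intrinsics are all assumed inputs.

```swift
import simd

// Hypothetical inputs: a head pose from a face tracker running on the front
// camera, and a fixed rigid transform between the two camera frames
// (known from device calibration).

struct Ray {
    var origin: SIMD3<Float>
    var direction: SIMD3<Float>   // unit vector
}

/// Re-express a head-gaze ray, measured in the front camera's frame,
/// in the rear camera's frame so it can be cast into the rear view.
func gazeRayInRearFrame(headPosition: SIMD3<Float>,
                        headForward: SIMD3<Float>,
                        frontToRear: simd_float4x4) -> Ray {
    // Points transform with w = 1, directions with w = 0.
    let o = frontToRear * SIMD4<Float>(headPosition.x, headPosition.y, headPosition.z, 1)
    let d = frontToRear * SIMD4<Float>(headForward.x, headForward.y, headForward.z, 0)
    return Ray(origin: SIMD3<Float>(o.x, o.y, o.z),
               direction: simd_normalize(SIMD3<Float>(d.x, d.y, d.z)))
}

/// Project a 3D point in the rear camera's frame onto its image plane with a
/// simple pinhole model (fx, fy, cx, cy are the rear camera's intrinsics).
/// Walking along the gaze ray and projecting each point traces the gaze
/// across the rear image, where detected objects can then be tested for a hit.
func projectToRearImage(_ p: SIMD3<Float>,
                        fx: Float, fy: Float,
                        cx: Float, cy: Float) -> SIMD2<Float>? {
    guard p.z > 0 else { return nil }   // point is behind the camera
    return SIMD2<Float>(fx * p.x / p.z + cx, fy * p.y / p.z + cy)
}
```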

In a research paper presenting the prototype they also suggest it could be used to “help to socialize mobile AR experiences, currently typified by people walking down the street looking down at their devices”.

Asked to expand on this, CMU researcher Chris Harrison told TechCrunch: “People are always walking and looking down at their phones, which isn’t very social. They aren’t engaging with other people, or even looking at the beautiful world around them. With something like WorldGaze, people can look out into the world, but still ask questions to their smartphone. If I’m walking down the street, I can inquire and listen about restaurant reviews or add things to my shopping list without having to look down at my phone. But the phone still has all the smarts. I don’t have to buy something extra or special.”

In the paper they note there is a large body of research related to tracking users’ gaze for interactive purposes — but a key aim of their work here was to develop “a functional, real-time prototype, constraining ourselves to hardware found on commodity smartphones”. (Although the rear camera’s field of view is one potential limitation they discuss, including suggesting a partial workaround for any hardware that falls short.)

“Although WorldGaze could be launched as a standalone application, we believe it is more likely for WorldGaze to be integrated as a background service that wakes upon a voice assistant trigger (e.g., “Hey Siri”),” they also write. “Although opening both cameras and performing computer vision processing is energy consumptive, the duty cycle would be so low as to not significantly impact battery life of today’s smartphones. It may even be that only a single frame is needed from both cameras, after which they can turn back off (WorldGaze startup time is 7 sec). Using bench equipment, we estimated power consumption at ~0.1 mWh per inquiry.”

Of course there’s still something a bit awkward about a human holding a screen up in front of their face and talking to it — but Harrison confirms the software could work just as easily hands-free on a pair of smart spectacles.

“Both are possible,” he told us. “We choose to focus on smartphones simply because everyone has one (and WorldGaze could literally be a software update), while almost no one has AR glasses (yet).  But the premise of using where you are looking to supercharge voice assistants applies to both.”

“Increasingly, AR glasses include sensors to track gaze location (e.g., Magic Leap, which uses it for focusing reasons), so in that case, one only needs outwards facing cameras,” he added.

Taking a further leap it’s possible to imagine such a system being combined with facial recognition technology — to allow a smart spec-wearer to quietly tip their head and ask ‘who’s that?’ — assuming the necessary facial data was legally available in the AI’s memory banks.

Features such as “add to contacts” or “when did we last meet” could then be unlocked, to augment a networking or socializing experience. Although, at this point, the privacy implications of unleashing such a system into the real world look rather more challenging than stitching together the engineering. (See, for example, Apple banning Clearview AI’s app for violating its rules.)

“There would have to be a level of security and permissions to go along with this, and it’s not something we are contemplating right now, but it’s an interesting (and potentially scary idea),” agrees Harrison when we ask about such a possibility.

The team was due to present the research at ACM CHI — but the conference was canceled due to the coronavirus.



from iPhone – TechCrunch https://ift.tt/2VaAM5N

Apple’s iPad Pro Magic Keyboard arrives next week, ahead of schedule

The global supply chain is currently being squeezed from all angles amid a global pandemic. At least one product, however, will arrive ahead of schedule. Originally planned for a May arrival, the iPad Pro’s new trackpad-sporting Magic Keyboard is up for preorder now and set to start shipping next week.

You can read all of the details about the accessory here, along with Matthew’s hands-on time with the product via the iPad Pro review. The gist is basically that Apple’s further blurring the lines between the iPad and MacBook with additional hardware and software productivity updates.

The peripheral harnesses the cursor and mouse support delivered via iPadOS 13.4. The “floating” swiveled design allows for a 130-degree range of viewing angles, while the backlit keys use the company’s much-improved scissor switch design. There’s also an additional USB-C port for charging, but not data.

Along with the new Pro, it also works with the 2018 version. It’s not cheap, however, priced at $299 for the 11-inch and $349 for the 12.9-inch. Apple is also working with accessory makers to offer lower-priced trackpad-sporting cases.



from Apple – TechCrunch https://ift.tt/2RGt5SS

Apple introduces new $399 iPhone SE with Touch ID and 4.7″ screen

Apple has dropped a new iPhone SE on the market today. It’s a 4.7” iPhone with a physical home button, Touch ID, a single rear-facing camera and the A13 Bionic chip on board. With a $399 starting price point, the new SE is aimed squarely at new iPhone users or first-time smartphone buyers, but it could also appeal to anyone who prizes having the smallest iPhone model currently available above other considerations.

Pre-orders for the iPhone SE begin at 5:00 a.m. PDT on April 17th and it will ship on April 24th.

It comes in black, silver and Product(RED) editions and features a single rear-facing camera and a single front-facing camera. This is Apple’s new entry-level iPhone.

The overall package is pretty appealing here. It’s got the same A13 chip as in the iPhone 11 and iPhone 11 Pro and Apple tells me that the processor performance in the SE is comparable and not toned down for the more affordable unit.

The display is Apple’s Retina HD unit, which is an LCD panel. It is not a Liquid Retina display like the one in the iPhone 11 and iPhone XR. I’m still waiting on specs to see what we’re looking at from a contrast ratio perspective here, but it does have True Tone.

Probably the biggest defining feature of the iPhone SE besides its size is its inclusion of a physical home button with Touch ID instead of the Face ID system we’ve come to expect on new iPhones. It’s not clear whether that’s due to size constraints preventing the inclusion of the front-facing TrueDepth camera array that Face ID needs — but pricing is probably just as likely to figure in this calculation.

Touch ID is reliable and even preferred by some users, though the physical home button has long been one of the biggest hardware failure points of iPhones that have it. In our new mask-wearing world, though, a groundswell of Touch ID enthusiasm has been building. It’s hard to make Face ID systems properly recognize you behind a cloth wrap covering half of your face. This has been an issue for a while in Asia, where mask wearing has long been a matter of courtesy during allergy season or when a person is ill.

Camera and Comparisons

A couple of main things make Apple’s claim that the iPhone SE has ‘the best single-camera system’ supportable. You may recall that the iPhone XR also supported portrait mode and had a rear camera of the same resolution. But with the iPhone SE, you have the A13 Bionic, a new ISP and the Neural Engine, which have improved things significantly in the machine learning department — allowing for segmentation masks and semantic rendering, two big improvements that have made portrait mode far more effective on recent iPhone models.

Apple only supported three lighting effects on the XR — the ones that didn’t require stripping away the background. The rest demand more beef in the rendering and separation pipeline, and the iPhone SE can now handle them. The iPhone SE also gets the improved Smart HDR that came to the iPhone 11 — once again tied to the chip.

You also get a bunch of other benefits of that new image pipeline, including expanded dynamic range while shooting video at 4K 30fps, 4K 60fps cinematic stabilization and the improved Smart HDR while shooting still images. Apple has also brought all six lighting effects to the front-facing camera in this model.

It’s very much like getting the iPhone 11 Pro’s image pipeline attached to a single-camera system — but, and it’s a big but, you don’t get Night Mode. Night Mode is one of the most compelling iPhone camera features in a very long time, so buying the new SE is really a price-and-size-over-camera equation.

Lineup Placement

As far as I can tell, this release puts the number of iPhones Apple currently produces at roughly seven. Apple will cease selling the iPhone 8 with this release, and will sell the iPhone 8 Plus in certain regions until channel inventory is exhausted. That leaves the iPhone XR, XS and XS Max, the iPhone 11 and iPhone 11 Pro, and this new model. The iPhone SE’s pricing is incredibly attractive at $399 with 64GB of storage, with only a $50 bump to $449 for 128GB. The 256GB model runs $549.

If you’re comparing the iPhone XR to the iPhone SE, the only real case for the older model is if you must have the larger screen. But that seems like a hard sell at $200 more.

Overall, Apple seems to be working hard to mortar over the gaps in its iPhone pricing umbrella, making entry into its ecosystem more attractive. Once in, iPhone users tend to stick for the most part, both because of service-based lock-ins and high customer satisfaction.



from Apple – TechCrunch https://ift.tt/3ckkLzV

Scanwell begins 1,000 person study for at-home antibody test for COVID-19

At-home antibody testing for COVID-19 is the subject of ample debate among the scientific and medical community, with some seeing it as a necessary step toward selectively reopening parts of the economy by verifying which individuals in a community have immunity, and others questioning the accuracy and efficacy of currently available testing methods. Regardless of which side you’re on, it remains true that further testing is needed, and startup Scanwell has begun a sizeable study for its at-home antibody test while it continues to work with the FDA on emergency use authorization for the diagnostic.

Scanwell is working with the state of North Carolina and Raleigh-based Wake Forest Baptist Health to distribute 1,000 of its at-home antibody test kits to a random sampling of citizens, funded in part by $100,000 from the state legislature. The sample population, chosen from the patient pool of Wake Forest Baptist Health’s system and meant to be a statistically representative snapshot of the larger population, will get a finger-prick blood sample collection kit by mail every month for a full year, with the aim of tracking the virus and immunity over time.

The Scanwell test can only be used for research purposes at this time, since it hasn’t yet received an emergency use authorization by the FDA. The FDA has so far specifically not authorized any at-home tests for COVID-19, including those supported by telemedicine, but it has recently updated its guidance to note that it “sees the public health value in expanding the availability of COVID-19 testing through safe and accurate tests that may include home collection,” and it says it is in the process of actively pursuing the development of tests that fit that profile in partnership with diagnostic companies.

LA-based Scanwell Health, which already provides at-home diagnostics for detecting UTIs, announced its work on securing FDA authorization for use of its at-home serological antibody test last month. The test kits can provide results in as little as 15 minutes once they’re received by diagnostic labs, but questions have been raised about the general accuracy of antibody testing overall regarding COVID-19, and there’s still some debate about the nature and duration of post-infection immunity for people who have contracted and recovered from the virus.

Better understanding immunity and who has recovered are key ingredients in any attempt to gradually relax isolation restrictions, so immunity testing is a core component of that. It’s something that will be needed at scale, along with infection testing through existing molecular testing methods and contact tracing, like the system being put in place by Apple and Google.



from Apple – TechCrunch https://ift.tt/2XBDItP

Digital mapping of coronavirus contacts will have key role in lifting Europe’s lockdown, says Commission

The European Commission has set out a plan for co-ordinating the lifting of regional coronavirus restrictions that includes a role for digital tools — in what the EU executive couches as “a robust system of reporting and contact tracing”. However it has reiterated that such tools must “fully respect data privacy”.

Last week the Commission made a similar call for a common approach to data and apps for fighting the coronavirus, emphasizing the need for technical measures to be taken to ensure that citizens’ rights and freedoms aren’t torched in the scramble for a tech fix.

Today’s toolbox of measures and principles is the next step in its push to coordinate a pan-EU response.

“Responsible planning on the ground, wisely balancing the interests of protection of public health with those of the functioning of our societies, needs a solid foundation. That’s why the Commission has drawn up a catalogue of guidelines, criteria and measures that provide a basis for thoughtful action,” said EC president Ursula von der Leyen, commenting on the full roadmap in a statement.

“The strength of Europe lies in its social and economic balance. Together we learn from each other and help our European Union out of this crisis,” she added.

Harmonized data gathering and sharing by public health authorities — “on the spread of the virus, the characteristics of infected and recovered persons and their potential direct contacts” — is another key plank of the plan for lifting coronavirus restrictions on citizens within the 27 Member State bloc.

Meanwhile, ‘anonymized and aggregated’ data from commercial sources — such as telcos and social media platforms — is seen as a potential aid to pandemic modelling and forecasting efforts, per the plan.

“Social media and mobile network operators can offer a wealth of data on mobility, social interactions, as well as voluntary reports of mild disease cases (e.g. via participatory surveillance) and/or indirect early signals of disease spread (e.g. searches/posts on unusual symptoms),” it writes. “Such data, if pooled and used in anonymised, aggregated format in compliance with EU data protection and privacy rules, could contribute to improve the quality of modelling and forecasting for the pandemic at EU level.”

The Commission has been leaning on telcos to hand over fuzzy metadata for coronavirus modelling which it wants done by the EU’s Joint Research Centre. It wrote to 19 mobile operators last week to formalize its request, per Euractiv, which reported yesterday that its aim is to have the data exchange system operational ‘as soon as possible’ — with the hope being it will cover all the EU’s member states.

Other measures included in the wider roadmap are the need for states to expand their coronavirus testing capacity and harmonize testing methodologies — with the Commission today issuing guidelines to support the development of “safe and reliable testing”.

Steps to support the reopening of internal and external EU borders are another area of focus, with the executive generally urging a gradual and phased lifting of coronavirus restrictions.

On contacts tracing apps specifically, the Commission writes:

“Mobile applications that warn citizens of an increased risk due to contact with a person tested positive for COVID-19 are particularly relevant in the phase of lifting containment measures, when the infection risk grows as more and more people get in contact with each other. As experienced by other countries dealing with the COVID-19 pandemic, these applications can help interrupt infection chains and reduce the risk of further virus transmission. They should thus be an important element in the strategies put in place by Member States, complementing other measures like increased testing capacities.

“The use of such mobile applications should be voluntary for individuals, based on users’ consent and fully respecting European privacy and personal data protection rules. When using tracing apps, users should remain in control of their data. National health authorities should be involved in the design of the system. Tracing close proximity between mobile devices should be allowed only on an anonymous and aggregated basis, without any tracking of citizens, and names of possibly infected persons should not be disclosed to other users. Mobile tracing and warning applications should be subject to demanding transparency requirements, be deactivated as soon as the COVID-19 crisis is over and any remaining data erased.”

“Confidence in these applications and their respect of privacy and data protection are paramount to their success and effectiveness,” it adds.

Earlier this week Apple and Google announced a collaboration around coronavirus contacts tracing — throwing their weight behind a privacy-sensitive decentralized approach to proximity tracking that would see ephemeral IDs processed locally on devices, rather than being continually uploaded and held on a central server.
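
To make the decentralized pattern concrete, here is a heavily simplified Swift sketch of how ephemeral IDs might be derived and matched entirely on-device. The key schedule, ID length and interval count are assumptions made for this example and do not reflect the actual Apple/Google protocol.

```swift
import Foundation
import CryptoKit

// Illustrative only: the key schedule, ID length and interval count here are
// assumptions for this sketch, not the Apple/Google Exposure Notification spec.
struct DecentralizedTracer {
    /// Secret daily key. It never leaves the device unless the user tests
    /// positive and chooses to publish it.
    let dailyKey = SymmetricKey(size: .bits256)

    /// Ephemeral IDs seen on nearby devices via Bluetooth, stored locally.
    var observedIDs: Set<Data> = []

    /// Derive the rotating ephemeral ID for a given time interval
    /// (e.g. a 10-minute window) from a daily key.
    func ephemeralID(interval: UInt32, from key: SymmetricKey) -> Data {
        var counter = interval.littleEndian
        let message = Data(bytes: &counter, count: MemoryLayout<UInt32>.size)
        let mac = HMAC<SHA256>.authenticationCode(for: message, using: key)
        return Data(mac).prefix(16)   // truncate to a compact beacon payload
    }

    /// The ID this device is currently broadcasting.
    func currentID(interval: UInt32) -> Data {
        ephemeralID(interval: interval, from: dailyKey)
    }

    /// When a positive patient publishes their daily keys, every device
    /// re-derives the corresponding ephemeral IDs and checks for overlap
    /// locally: no central server ever learns who met whom.
    func exposureDetected(publishedKeys: [SymmetricKey],
                          intervalsPerDay: UInt32 = 144) -> Bool {
        for key in publishedKeys {
            for interval in 0..<intervalsPerDay {
                if observedIDs.contains(ephemeralID(interval: interval, from: key)) {
                    return true
                }
            }
        }
        return false
    }
}
```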

A similar decentralized infrastructure for Bluetooth-based COVID-19 contacts tracing had already been suggested by a European coalition of privacy and security experts, as we reported last week.

Meanwhile, a separate coalition of European technologists and researchers has been pushing a standardization effort for COVID-19 contacts tracing that they say will support either centralized or decentralized approaches — in the hope of garnering the broadest possible international backing.

For its part the Commission has urged the use of technologies such as decentralization for COVID-19 contacts tracing to ensure tools align with core EU principles for handling personal data and safeguarding individual privacy, such as data minimization.

However governments in the region are working on a variety of apps and approaches for coronavirus contacts tracing that don’t all look as if they will check a ‘rights respecting’ box…

In a video address last week, Europe’s lead privacy regulator, the EDPS, intervened to call for a “pan-European model ‘COVID-19 mobile application’, coordinated at EU level” — in light of varied tech efforts by Member States which involve the processing of personal data for a claimed public health purpose.

“The use of temporary broadcast identifiers and bluetooth technology for contact tracing seems to be a useful path to achieve privacy and personal data protection effectively,” said Wojciech Wiewiórowski last Monday. “Given these divergences, the European Data Protection Supervisor calls for a pan-European model “COVID-19 mobile application”, coordinated at EU level. Ideally, coordination with the World Health Organisation should also take place, to ensure data protection by design globally from the start.”

The Commission has not gone so far in today’s plan — calling instead for Member States to ensure their own efforts align with the EU’s existing data protection framework.

Its roadmap is also heavy on talk of the need for “coordination between Member States to avoid negative effects” — which it dubs “a matter of common European interest”. For now, though, the Commission has issued a list of recommendations; it’s up to Member States to choose to fall in behind them or not.

With the caveat that EU regulators are watching very carefully how states handle citizens’ data.

“Legality, transparency and proportionality are essential for me,” warned Wiewiórowski, ending last week’s intervention on the EU digital response to the coronavirus with a call for “digital solidarity, which should make data working for all people in Europe and especially for the most vulnerable” — and a cry against “the now tarnished and discredited business models of constant surveillance and targeting that have so damaged trust in the digital society”.



from Apple – TechCrunch https://ift.tt/2yVq0aS

Tuesday, 14 April 2020

Apple opens access to mobility data, offering insight into how COVID-19 is changing cities

Apple is providing a dataset derived from aggregated, anonymized information taken from users of its Maps navigation app, the company announced today. The data is published as a set of “Mobility Trends Reports,” updated daily, which show the change in the number of routing requests made within the Maps app — the default navigation app on iPhones — for three modes of transportation: driving, walking and transit.

Apple is quick to note that this information isn’t tied to any individuals, since Maps does not associate any mobility data with a user’s Apple ID, nor does it maintain any history of where people have been. In fact, Apple notes that all data collected by Maps, including search terms and specific routing, is only ever tied to random, rotating identifiers that are reset on a rolling basis. This anonymized, aggregated data is collected only to provide a city, country or region-level view, representing the change over time in the number of pedestrians, drivers and transit-takers in an area based on the number of times they open the app and ask for directions.

As far as signals go for measuring the decrease in outdoor activity in a given city, this is a pretty good one considering Apple’s install base and the fact that most users probably don’t bother installing or using a third-party app like Google Maps for their daily commuting or transportation needs.

The data is available to all directly from Apple’s website, and can be downloaded in a broadly compatible CSV format. You can also use the web-based version to search a particular location and see the overall trend for that area.
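
For anyone who wants to poke at the numbers, a few lines of Swift are enough to pull a single region’s trend out of the file. The sketch below assumes a layout with a ‘region’ column, a ‘transportation_type’ column and one column per date (check the header of the actual download before relying on it), and it uses naive comma splitting, so quoted fields containing commas would be mis-parsed.

```swift
import Foundation

/// Pull one region's daily series out of the mobility trends CSV.
/// Column names below are assumptions; verify them against the real file.
func mobilityTrend(csvPath: String,
                   region: String,
                   transportationType: String) throws -> [(date: String, value: Double)] {
    let text = try String(contentsOfFile: csvPath, encoding: .utf8)
    let rows = text
        .split(whereSeparator: { $0.isNewline })
        .map { $0.split(separator: ",", omittingEmptySubsequences: false).map(String.init) }
    guard let header = rows.first,
          let regionCol = header.firstIndex(of: "region"),
          let typeCol = header.firstIndex(of: "transportation_type") else {
        return []
    }
    // Assume every column after the identifying ones is a date column.
    let firstDateCol = max(regionCol, typeCol) + 1

    for row in rows.dropFirst() where row.count == header.count {
        if row[regionCol] == region && row[typeCol] == transportationType {
            return zip(header[firstDateCol...], row[firstDateCol...]).compactMap { date, value in
                Double(value).map { (date: date, value: $0) }
            }
        }
    }
    return []
}

// Example usage (hypothetical file name and region):
// let walking = try mobilityTrend(csvPath: "applemobilitytrends.csv",
//                                 region: "London",
//                                 transportationType: "walking")
```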

For an individual, this is more or less a curiosity, but the release of this info could be very useful for municipal, state and federal policymakers looking to study the impact of COVID-19, as well as the effect of strategies put in place to mitigate its spread, including social distancing, shelter-in-place and quarantine measures.

Apple has also announced that it’s working with Google on a new system-level, anonymized contact tracing system that both companies will first release as APIs for use by developers, before making them native built-in features that are supplemented by public health agency applications and guidance. Apple seems particularly eager to do what it can to assist with the ongoing COVID-19 crisis, while still striving to ensure that these measures respect the privacy of their individual users. That’s a hard balance to strike in terms of taking effective action at a population level, but Apple’s reach is a powerful potential advantage to any tools it provides.



from iPhone – TechCrunch https://ift.tt/2VnQOs0