Friday, April 17, 2020

'A very scary idea': How Google and Apple's COVID-19 tracing could turn sick people into pariahs
April 16, 2020


Apple (AAPL) and Alphabet’s Google (GOOG, GOOGL) say privacy and security are top of mind for their collaboration to enable “contact tracing” apps to function across iOS and Android mobile devices to contain the novel coronavirus. However, telecom and cybersecurity experts say the technology may not fully protect users, and they point to scenarios in which the identities of infected individuals could be exposed.
The planned technology is slated to use low-power Bluetooth functionality standard on iPhone and Android mobile operating systems. The companies propose that as long as a mobile phone’s Bluetooth remains turned on and enabled, a “beacon” can exchange information about COVID-positive disease status between phones within 10 to 15 feet of each other.
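In code terms, the exchange described above amounts to each phone broadcasting a short anonymous identifier over Bluetooth and quietly logging the identifiers it hears at close range. The following is a minimal sketch of what such an on-device log could look like; the names (ObservedBeacon, LocalContactLog) and the signal-strength cutoff are illustrative assumptions, not details from Apple and Google's published framework.

import time
from dataclasses import dataclass, field
from typing import List

@dataclass
class ObservedBeacon:
    """One Bluetooth proximity event as described above (hypothetical fields)."""
    rolling_id: bytes   # anonymous identifier broadcast by the nearby phone
    seen_at: float      # Unix timestamp when the advertisement was heard
    rssi: int           # received signal strength, a rough proxy for the 10-to-15-foot range

@dataclass
class LocalContactLog:
    """Kept only on the listening device; nothing here names the other user."""
    observations: List[ObservedBeacon] = field(default_factory=list)

    def record(self, rolling_id: bytes, rssi: int) -> None:
        # Log only advertisements strong enough to suggest close proximity.
        NEARBY_RSSI_THRESHOLD = -65  # illustrative cutoff, not a value from any spec
        if rssi >= NEARBY_RSSI_THRESHOLD:
            self.observations.append(ObservedBeacon(rolling_id, time.time(), rssi))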
The privacy questions lie in the difficulty of setting an effective tracking distance and of choosing the time frame between when a device detects close proximity to an infected user’s device and when the receiving device is notified of the proximity event, according to Ben Levitan, a veteran telecommunications engineer.
“The only way to fix this is to come up with a notification area that's fairly large,” Levitan said, explaining the so-called fix could render the tracking ineffective. “If you give me notifications that someone is within 10 feet, they have The Scarlet Letter. They may have lost their privacy, but it's very valuable to me. Though if you set the parameter to a mile, it's kind of useless to everybody.” 

Here’s how it works

The technology requires two levels of opt-in cooperation: first, from users who voluntarily enter their positive diagnosis into the app and agree to have their status anonymously “beaconed” by their health authority’s app, and second, from app users who agree to receive notifications if their mobile device comes into close proximity with one containing a positive COVID-19 indication. 

An Apple spokesperson explained that mobile devices can watch for these “proximity events,” then allow apps to notify users whose devices have come into contact with devices indicating users tested positive for COVID-19. Notifications are generated for proximity events that occurred within the past 14 days. While the information doesn’t allow users to avoid coming into contact with COVID-positive users, and therefore protect themselves from a high-risk contact, it allows users who find out after the fact to seek appropriate treatment, testing, and personal quarantine if they’ve encountered a device whose user indicates a COVID-19 positive status.
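Based on that description, the notification step reduces to checking the device's own 14-day log against the identifiers published for users who reported a positive test. A hedged sketch, reusing the hypothetical LocalContactLog above:

import time

FOURTEEN_DAYS = 14 * 24 * 60 * 60  # the 14-day look-back window, in seconds

def find_exposures(local_log, positive_rolling_ids):
    """Return proximity events from the past 14 days whose anonymous identifier
    appears among those published for users who reported a positive test.
    local_log is the hypothetical LocalContactLog sketched earlier."""
    cutoff = time.time() - FOURTEEN_DAYS
    positives = set(positive_rolling_ids)
    return [obs for obs in local_log.observations
            if obs.seen_at >= cutoff and obs.rolling_id in positives]

In this sketch the matching happens entirely on the handset, with only the anonymous identifiers of positive users fetched from a server.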
During a press call on Monday, Google and Apple representatives said data is stored locally on the user’s device, requiring no collection or sharing of location or personally identifying information. Beacon information is uploaded to a local server and saved for 14 days, where it can be discovered by devices running a participating app.
Each app user is assigned a unique identifying number known only to their device, according to cybersecurity expert and adjunct industry professor for Information Technology and Management at Illinois Institute of Technology, Louis J. McHugh IV. When the app is activated, the ID number is encrypted using one-way, three-stage encryption.
“According to what the framework says, everything is going through a central database of keeping track of all the people we meet,” McHugh said.
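McHugh's description of a per-device ID passed through one-way encryption can be illustrated with a small sketch: a random secret that stays on the phone, and rotating identifiers derived from it with a one-way function. The use of HMAC-SHA256 and the 16-byte truncation here are assumptions chosen for illustration, not the key schedule Apple and Google have documented.

import hashlib
import hmac
import os

def new_device_secret() -> bytes:
    """A random per-device secret; in the scheme described, it never leaves the phone."""
    return os.urandom(32)

def rolling_identifier(device_secret: bytes, interval_number: int) -> bytes:
    """Derive the anonymous identifier broadcast during one time interval.
    Because HMAC-SHA256 is one-way, someone who collects broadcast identifiers
    cannot recover the device secret, and cannot link identifiers from
    different intervals without it."""
    msg = interval_number.to_bytes(8, "big")
    return hmac.new(device_secret, msg, hashlib.sha256).digest()[:16]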

Countless scenarios where a user’s identity becomes obvious

While the tech giants say privacy remains top of mind, the app architecture, as described, may not be capable of achieving total anonymity.
As one example, imagine an app user takes a 30-minute walk and sees only one other person. If the user is notified of a proximity event before encountering anyone else, and had not come within Bluetooth range of anyone else in the past two weeks, the COVID-19-positive individual’s identity may be revealed. The scenario is just one of countless possibilities in which a COVID-19-positive user’s identity could be extrapolated.
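The arithmetic behind that scenario is simple: the fewer encounters a user logs in the notification window, the easier it is to put a face to a match. A toy check, assuming the hypothetical structures from the sketches above:

def exposure_identifies_contact(local_log, exposures, window_start, window_end):
    """Toy version of the single-contact scenario: if the device logged only one
    proximity event in the notification window and that event is now flagged as
    an exposure, the user effectively knows who tested positive."""
    in_window = [obs for obs in local_log.observations
                 if window_end >= obs.seen_at >= window_start]
    return len(in_window) == 1 and in_window[0] in exposures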
“It’s a very scary idea,” McHugh said, explaining that how easy or difficult it becomes to identify a COVID-positive person’s identity will boil down to how the application is developed — whether it’s solely backward looking versus actively monitoring. “Once I have that Scarlet Letter, per se, when I'm walking around the park walking my dog, am I notifying everybody when I'm walking?”
Neither company addressed how quickly an app user is notified once a proximity event occurs.

Hurdles to make the technology even work

Levitan, who spent 30 years developing worldwide cellphone networks for Yahoo Finance’s parent company Verizon, Sprint, and others, said the level of cooperation required is unlikely to be achieved. He pointed to layers of factors working against adoption of the technology.
In addition to the smaller hurdle of keeping Bluetooth constantly activated and mobile phones powered on, there are emotional hurdles, too, he said.


“I think if I had COVID-19 and I wanted to go out, just to go get some groceries — even walking on the beach — all of a sudden people's phones would be blowing up and they'd be running away from me and I’m going to be sick of being a pariah and I’d turn off my phone,” he said.
“And unless you make it a crime to turn off your phone, or a crime not to report it, like we do [with] sex offenders — and you are shunned by your neighbors because you're on a database,” he said, it may be difficult to get cooperation needed for the new software to aid in reduced transmission. 
“It is pretty draconian.”
At a minimum, for the technology to work, McHugh said it would need to collect information about date and time. “Who owns that 14 days of data?” he asked.
The fact that the technology is not using location data, McHugh said, is positive. However, he cautioned, Google and Apple have not explained what personal information — like names, addresses, phone numbers — will be collected, and whether the requirements would be set or retained by the developers and agencies that run the apps.
“I think the big deal is once we get the app, and once we get our personal information in the app, that's the deal. That's where the shoe meets the road,” McHugh said.
Moreover, McHugh said Bluetooth is not an ideal technology to keep constantly activated on a mobile device, which the tracking software requires in order to be effective.
“That's why [Google and Apple] had to bring encryption to the party,” McHugh said. “If you saw the list of attacks when you're vulnerable to Bluetooth, you would never use it again. There's multitudes of attack factors with a Bluetooth surface because Bluetooth, unfortunately is inherently insecure.” A Bluetooth breach can compromise files, photos, call logs, contacts and most data stored on the device, with the exception of encrypted information, he said.
Asked whether data on the system’s servers can be breached, Apple and Google acknowledged that attacks can happen despite best efforts. They also emphasized that the decentralized nature of the storage — data stored locally on individual devices and among multiple servers — would discourage bad actors as infiltrating such information would be difficult and expensive. 

No precedent in the U.S.

During Phase 1 of the rollout, Apple and Google plan to provide operating system updates on iOS and Android devices that will provide the functionality app developers need to create the apps. Users would need to then initiate a download to use the app offered by their regional health agency.
Under Phase 2, the companies plan another operating system update that will pre-install a built-in user interface, allowing iPhone and Android device users to get started before they’ve installed the app. The rationale for pre-installation, they said, was to reduce barriers and get more people to use the apps. Once developed, the apps will remain under the authority of regional public health agencies.


Even without formalized criminal or civil penalties, software like this has never been implemented in the U.S., so there is no precedent for how the information could be used against its users. Most U.S. states have criminal laws that make intentional or reckless transmission of communicable diseases, such as sexually transmitted diseases, punishable by imprisonment. Federal statutes also provide for imprisonment for violation of quarantine orders.
Embedding the software in mobile device operating systems could be a “slippery slope,” McHugh said. “Are you going to take away my functionality to turn off my Bluetooth, because I know myself when I’m not using Bluetooth I turn it off,” he said.
McHugh’s colleague, Illinois Institute of Technology professor Jeremy Hajek, said that because Apple devices represent nearly 20% of worldwide mobile devices and Google’s Android devices represent around 80%, the tracking, if pre-installed, would be ready to go on nearly 100% of the world’s devices.
“I think there are people in Apple, Google who really really want to help, and they say, ‘Hey we have this massive amount of power — the CIA couldn’t do this, the Army couldn't do this, but we could,’” Hajek said. Then, he said, the question becomes whether, once a technology like this is implemented, it is gradually expanded into other areas.
“And you think this system gets shut down the day after COVID-19 is declared done?” Levitan asked.
“I appreciate the work that two competing companies are doing in trying times for, let's be frank, humanity,” McHugh said. “We’ve just got to be coherent of the privacy and security risks as a whole, and that it not just be a rush to bring this to market.”

https://finance.yahoo.com/news/how-google-and-apples-new-coronavirus-tracing-could-turn-people-into-pariahs-201733674.html