
【中英文学习】人脸识别 无处躲藏

中文


人脸识别

无处躲藏


人脸识别不只是另一种技术。它将改变社会

人类的脸是一件杰作。面部特征之纷繁各异令人惊叹,它让人们能相互辨认,也是形成复杂社会群体的关键。人脸传递情感信号的功能也同样重要,无论是通过下意识的脸红还是有技巧的假笑。人们在清醒时花费大量时光研读一张张面孔——在办公室,在法庭,在酒吧,在卧室,寻找着兴趣、敌意、信任和欺骗的迹象。他们也花大把的时间试图掩饰自己的神色。

科技正迅速赶上人类研读脸孔的能力。在美国,教堂使用人脸识别来追踪教徒做礼拜的出席情况;在英国,零售商用它来辨认有扒窃前科的顾客。今年,威尔士警方利用人脸识别在足球场外逮捕了一名嫌疑犯。在中国,人脸识别被用于验证网约车司机的身份、让游客刷脸进景点、让顾客微微一笑就能刷脸买单。苹果的新款iPhone预计将用这一技术来解锁屏幕。

与人类的技能相比,这样的应用看似只是锦上添花。飞行或互联网这样的重大突破明显改变了人类的能力,而人脸识别似乎只是对面孔进行编码。尽管人的面孔为个人独有,但也是公开的,因此乍看起来,技术并没有侵犯隐私之嫌。但是,低成本、快速、大量地记录、存储和分析人脸图像的能力终有一天会使隐私、公平和信任等观念发生根本性的改变。

终极战线

先说隐私。人脸相比指纹等其他生物特征数据的一个巨大区别就是它们能够远距离起作用。人们只要有手机就可以拍下照片,供人脸识别程序使用。俄罗斯的一款应用FindFace抓拍陌生人的照片与社交网络VKontakte上的照片比对,识别人的准确率达70%。Facebook的面部图片库不能被其他人提取,但是,举个例子,这家硅谷巨头可以获得汽车展厅内到访者的照片,然后使用人脸识别技术在自己的网站上找到这些人,向他们发送汽车广告。即使私人公司无法将照片和身份联系起来,国家往往可以做到。中国政府有公民的面部记录;美国半数成年人口的照片储存在数据库中,可供FBI使用。如今,执法机关在追踪罪犯方面拥有了一个强大的武器,但它可能会令公民隐私遭受巨大的损害。

人脸不仅仅能表明身份,它还显示了许多其他信息,同样能由机器读取。这同样带来了一些益处。一些公司正通过分析脸部特征来自动诊断罕见遗传疾病,比如Hajdu-Cheney综合症【译注:颅骨发育不良伴肢端溶骨症】,和其他可能的手段相比,早早就发现了病情。测量情绪的系统也许能让自闭症患者更好地理解对他们来说难以捉摸的社交信号。但这项技术也造成了威胁。斯坦福大学的研究人员已经证明,面对一个男同性恋者和一个异性恋者的照片时,算法识别他们性取向的准确率可以达到81%。人类只能达到61%。在那些视同性恋为犯罪的国家,一个能从面部推断出性取向的软件让人恐惧。

钥匙,钱包,头套

不那么暴力的歧视也可能变得普遍。雇主本来就可能会根据自己的偏见来拒绝雇用某个人,而人脸识别也许会让这种偏见成为常态,令公司能够通过种族以及显现智力水平和性取向的特征过滤所有工作申请。夜总会和体育场馆也许会迫于压力,需要扫描入场者的脸来识别暴力威胁,以保护人们——尽管由于机器学习的性质,所有的人脸识别系统都不可避免地只能给出概率判断。此外,这类系统可能会对那些非白色皮肤的人有偏见,因为用来训练算法的数据集里大部分是白人面孔,这样的算法不太适用于其他种族。在影响法院保释和量刑决定的自动评估工具中,已经出现过这样的偏见。

最终,持续的面部记录和用计算机数据测量真实世界的小工具可能会改变社交互动的本质。掩饰有助于润滑日常生活的齿轮。如果你的伴侣能发现每一个强压下去的哈欠,你的老板能觉察每一丝恼怒的表情,婚姻和工作关系都会变得更真实,但也更不和谐。社交互动的基础可能也会改变,从基于信任的一系列承诺,变成对风险和回报的算计,这些算计则源自于计算机对人们面部信息的解读。人际关系可能变得更理性,但也变得更像交易。

至少在民主国家,立法可以帮助改变利弊之间的平衡。欧洲监管机构已在即将出台的数据保护法规中嵌入了一套原则,规定包括“脸纹”在内的生物信息属于其所有者,使用这些信息需要征得本人同意。这样,Facebook在欧洲就不能像在美国那样,直接向参观汽车展的人推送广告了。反歧视法律可以适用于筛选求职者照片的雇主。商业人脸识别系统的供应商可能要接受审核,证明它们的系统没有在无意中传播偏见。使用这些技术的公司也应该被问责。

然而这类规定并不能改变发展的方向。随着可穿戴设备的普及,摄像头只会越来越普遍。从太阳镜到化妆,试图欺骗人脸识别系统的种种努力已被挫败。剑桥大学的研究表明,人工智能可以重建伪装之下的面部结构。谷歌已经明确表示不会将面部信息和身份匹配,担心这会被非民主政权滥用。其他的科技公司似乎没那么讲究。亚马逊和微软都在使用它们的云服务来提供人脸识别,这项技术也是Facebook计划的核心。政府也不会想放弃这项技术带来的好处。改变即将到来。直面它吧。

EN


Facial recognition

Nowhere to hide

Facial recognition is not just another technology. It will change society


THE human face is a remarkable piece of work. The astonishing variety of facial features helps people recognise each other and is crucial to the formation of complex societies. So is the face’s ability to send emotional signals, whether through an involuntary blush or the artifice of a false smile. People spend much of their waking lives, in the office and the courtroom as well as the bar and the bedroom, reading faces, for signs of attraction, hostility, trust and deceit. They also spend plenty of time trying to dissimulate.

Technology is rapidly catching up with the human ability to read faces. In America facial recognition is used by churches to track worshippers’ attendance; in Britain, by retailers to spot past shoplifters. This year Welsh police used it to arrest a suspect outside a football game. In China it verifies the identities of ride-hailing drivers, permits tourists to enter attractions and lets people pay for things with a smile. Apple’s new iPhone is expected to use it to unlock the homescreen.

Set against human skills, such applications might seem incremental. Some breakthroughs, such as flight or the internet, obviously transform human abilities; facial recognition seems merely to encode them. Although faces are peculiar to individuals, they are also public, so technology does not, at first sight, intrude on something that is private. And yet the ability to record, store and analyse images of faces cheaply, quickly and on a vast scale promises one day to bring about fundamental changes to notions of privacy, fairness and trust.

The final frontier

Start with privacy. One big difference between faces and other biometric data, such as fingerprints, is that they work at a distance. Anyone with a phone can take a picture for facial-recognition programs to use. FindFace, an app in Russia, compares snaps of strangers with pictures on VKontakte, a social network, and can identify people with a 70% accuracy rate. Facebook’s bank of facial images cannot be scraped by others, but the Silicon Valley giant could obtain pictures of visitors to a car showroom, say, and later use facial recognition to serve them ads for cars. Even if private firms are unable to join the dots between images and identity, the state often can. China’s government keeps a record of its citizens’ faces; photographs of half of America’s adult population are stored in databases that can be used by the FBI. Law-enforcement agencies now have a powerful weapon in their ability to track criminals, but at enormous potential cost to citizens’ privacy.
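Under the hood, matching in systems like FindFace typically reduces to comparing fixed-length face "embeddings" by a similarity score against a threshold, which is why identification is probabilistic rather than certain. A toy sketch in plain Python (the four-dimensional vectors, the names and the 0.8 threshold are all invented for illustration; real systems use learned embeddings of 128 or more dimensions):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def identify(query, gallery, threshold=0.8):
    """Return (name, score) of the closest gallery identity,
    or (None, score) if nothing clears the threshold."""
    best_name, best_score = None, -1.0
    for person, embedding in gallery.items():
        score = cosine_similarity(query, embedding)
        if score > best_score:
            best_name, best_score = person, score
    if best_score < threshold:
        return None, best_score
    return best_name, best_score

# Invented embeddings standing in for a real model's output.
gallery = {
    "alice": [0.9, 0.1, 0.3, 0.2],
    "bob":   [0.1, 0.8, 0.2, 0.7],
}
snap = [0.85, 0.15, 0.35, 0.2]  # a photo taken at a distance, embedded

name, score = identify(snap, gallery)
```

Raising the threshold trades missed matches for fewer false identifications, which is one reason a reported figure such as FindFace's 70% accuracy depends heavily on how the system is tuned.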

The face is not just a name-tag. It displays a lot of other information—and machines can read that, too. Again, that promises benefits. Some firms are analysing faces to provide automated diagnoses of rare genetic conditions, such as Hajdu-Cheney syndrome, far earlier than would otherwise be possible. Systems that measure emotion may give autistic people a grasp of social signals they find elusive. But the technology also threatens. Researchers at Stanford University have demonstrated that, when shown pictures of one gay man, and one straight man, the algorithm could attribute their sexuality correctly 81% of the time. Humans managed only 61%. In countries where homosexuality is a crime, software which promises to infer sexuality from a face is an alarming prospect.

Keys, wallet, balaclava

Less violent forms of discrimination could also become common. Employers can already act on their prejudices to deny people a job. But facial recognition could make such bias routine, enabling firms to filter all job applications for ethnicity and signs of intelligence and sexuality. Nightclubs and sports grounds may face pressure to protect people by scanning entrants’ faces for the threat of violence—even though, owing to the nature of machine-learning, all facial-recognition systems inevitably deal in probabilities. Moreover, such systems may be biased against those who do not have white skin, since algorithms trained on data sets of mostly white faces do not work well with different ethnicities. Such biases have cropped up in automated assessments used to inform courts’ decisions about bail and sentencing.
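The bias described above is also auditable: a basic check is to compute a system's error rate separately for each demographic group in a labelled test set and compare. A minimal sketch, with invented audit records (the group labels, ground-truth identities and predictions are all hypothetical):

```python
from collections import defaultdict

def error_rate_by_group(records):
    """records: iterable of (group, true_identity, predicted_identity).
    Returns {group: fraction of records the system got wrong}."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, truth, predicted in records:
        totals[group] += 1
        if predicted != truth:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Invented audit data: a system that errs twice as often on group "B".
records = [
    ("A", "id1", "id1"), ("A", "id2", "id2"),
    ("A", "id3", "id3"), ("A", "id4", "id9"),
    ("B", "id5", "id5"), ("B", "id6", "id0"),
    ("B", "id7", "id0"), ("B", "id8", "id8"),
]
rates = error_rate_by_group(records)
```

A large gap between groups, as in this toy data, is exactly the kind of disparity an auditor of a commercial system would be expected to flag.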

Eventually, continuous facial recording and gadgets that paint computerised data onto the real world might change the texture of social interactions. Dissembling helps grease the wheels of daily life. If your partner can spot every suppressed yawn, and your boss every grimace of irritation, marriages and working relationships will be more truthful, but less harmonious. The basis of social interactions might change, too, from a set of commitments founded on trust to calculations of risk and reward derived from the information a computer attaches to someone’s face. Relationships might become more rational, but also more transactional.

In democracies, at least, legislation can help alter the balance of good and bad outcomes. European regulators have embedded a set of principles in forthcoming data-protection regulation, decreeing that biometric information, which would include “faceprints”, belongs to its owner and that its use requires consent—so that, in Europe, unlike America, Facebook could not just sell ads to those car-showroom visitors. Laws against discrimination can be applied to an employer screening candidates’ images. Suppliers of commercial face-recognition systems might submit to audits, to demonstrate that their systems are not propagating bias unintentionally. Firms that use such technologies should be held accountable.

Such rules cannot alter the direction of travel, however. Cameras will only become more common with the spread of wearable devices. Efforts to bamboozle facial-recognition systems, from sunglasses to make-up, are already being overtaken; research from the University of Cambridge shows that artificial intelligence can reconstruct the facial structures of people in disguise. Google has explicitly turned its back on matching faces to identities, for fear of its misuse by undemocratic regimes. Other tech firms seem less picky. Amazon and Microsoft are both using their cloud services to offer face recognition; it is central to Facebook’s plans. Governments will not want to forgo its benefits. Change is coming. Face up to it.

来源:《经济学人》2017年9月9日
