The Pixel 4 and Pixel 4 XL will launch with a facial recognition system that promises to be quite accurate. But all that sophistication may rest on a highly questionable practice: a company contracted by Google allegedly used, without consent, images of homeless people, university students, and dark-skinned people to train the technology.
- Google announces Incognito Mode for Maps and other privacy features
- How to disconnect my Google account from another device?
It is important for a company to have an extensive and diverse image database so that its facial recognition technology does not fail in ways that appear discriminatory, for example.
But it is also important that the procedures for collecting those images adhere to moral and ethical principles. This is where Google may have slipped, even if unintentionally: a report by the New York Daily News indicates that Randstad, the agency hired to make the recordings, did not inform participants about the purpose of the procedure.
The report also states that Randstad hired temporary workers to approach residents of a street in Atlanta, Georgia, students at a number of universities, and attendees of a film festival in Los Angeles, California, among others, in order to find volunteers willing to take part in a survey.
So far, nothing out of the ordinary. However, several of the workers told the New York Daily News that they were instructed to approach dark-skinned people in particular and not to tell participants that their faces would be recorded.
To convince the people they approached, the workers used arguments like "play with the phone for a few minutes and you get a gift card" and "test this app and get $5." Some were also instructed to say that the goal of the approach was to test a selfie app similar to Snapchat.
The most important thing was not to reveal that the participant's face was being recorded. If anyone became suspicious, the workers were to deny that a recording was being made or, depending on the case, to speed up the conversation before the person could refuse to take part.
The workers were also instructed to approach homeless people, because they would be the least likely to say anything to the media. As for university students, there was also an incentive to approach them: since they usually have very tight budgets, they tend to agree more readily to participate in paid surveys.
In July, Google confirmed to The Verge that it conducts "field research" involving face scans to train its facial recognition technology. The company also said that the scanned faces are tied to an abstract identity, not to the participant's email, and that participants are informed at the time of the approach.
According to Google, the facial data is retained for up to 18 months, but any participant may request that their information be deleted before then.
The problem is: if people were not aware that their faces were being scanned (many would not even have realized they were giving any sort of consent), how can they request the deletion of their data?
For the time being, Google says it is taking the complaints about Randstad seriously and will investigate them. The company also explained that the reported approaches, if confirmed, would be a violation of its terms and conditions for research volunteers.
It is not clear, however, what steps the company will take if the irregularities are confirmed.