Stanley sounded the alarm after consulting Invenda sales brochures that promised "the machines are capable of sending estimated ages and genders" of every person who used the machines, without ever requesting consent.
Yup, it's for "advertising." Say, for example, the Army wants to know which areas have the most fighting-aged men, so posters and recruiters know where to hang out. (This is the most extreme example.)
I saw some posts about a similar technology in the meetings and events industry: a company is selling "facial analysis" not "facial recognition." They try to get around privacy laws by saying "well our technology does scan every single face it sees, but it doesn't store that image, it just determines age, gender, race and emotional sentiment and adjusts tallies for those categories in a database."
It's still information gathering I didn't consent to while attending a conference, and it's a camera with the potential to be hacked.
Of course it's always about marketing and advertising. They want to have a heat map of which areas are popular and at what times. In the case of events so they can sell to sponsors and exhibitors. In this university it's less clear. Do the vending machines have a space to sell ads? That would be my guess.
Because people are dumb. If the machine knows when someone is looking at it, it can stop doing whatever it does to try and get your attention, and put itself in “sales mode”.
Still, you’re right. It seems like an overly complicated and expensive solution. Old-fashioned vending machines did the job just fine.
why do people think it's okay to do this shit? if you're coding facial recognition for a vending machine, that's like 80 steps too far down the capitalism ladder
if you took this machine back to the 1920s and told people what it was doing, they'd shoot at it. and probably you
This is the result of capitalism - corporations (aka the rich selfish assholes running them) will always attempt to do horrible things to earn more money, so long as they can get away with it, and only perhaps pay relatively small fines. The people who did this face no jail time and no real consequences - this is what unregulated capitalism brings. Corporations should not have rights or shield the people who run them - the people who run them need to face prison and personal consequences. (edited for spelling and missing word)
The article has a sound explanation: the machine is activated by detecting a human face looking at the display.
If this face recognition software only decides "face" or "not face" and does not store any data, I'm pretty sure this setup will be compatible with any data protection law.
OTOH they claim that these machines provide statistics about the age and gender of customers, so they are obviously recognising more than just "face yes". Still, if the data stored is just statistics on age and gender and no personalised data, I'm pretty sure it still complies even with 1920s data protection habits.
I'm pretty sure this would be GDPR-compliant, too, as long as the customer is informed, e.g. by including this info in the terms of service.
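If the machine really does keep only aggregate statistics, the entire "data store" could be as simple as one counter per demographic bucket. Here's a minimal sketch of that idea; the function names and buckets are made up for illustration, not taken from Invenda's actual software:

```python
from collections import Counter

# Hypothetical on-device store: only coarse demographic buckets are
# counted; the camera frame is never written anywhere.
tally = Counter()

def record_detection(age_band: str, gender: str) -> None:
    """Increment an aggregate counter; no image or identifier is kept."""
    tally[(age_band, gender)] += 1

record_detection("18-25", "male")
record_detection("18-25", "male")
record_detection("60+", "female")

# Only bucket counts survive -- nothing that maps back to a person.
print(tally[("18-25", "male")])  # 2
```

Whether the real firmware is actually this conservative is exactly what nobody outside the company can verify without the source.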
The students should get together and jack the machine away into their hacking club and do some reverse engineering, so that we get more information on how the data collection worked as opposed to just trusting the company's statements. If a hacking group like the German Chaos Computer Club got behind this, they could release their findings while keeping the perpetrators anonymous. However, I’m pretty sure the machine is just a frontend to a server, which got shut down as soon as the students complained, with no GDPR-like checkout being available in the jurisdiction.
Vending machine in the Amazon warehouse in MN just had a camera taking pictures also. Fucking weird-ass rich people/corps. A person was behind this decision and that face needs to be woken up with a punch every day....
Not only was a person behind the decision; a person was also behind the dissemination of the requirements, the implementation of the change, the design of the hardware, and all the steps in between.
When you start tinkering with a machine learning model of any kind, you’re probably going to find some interesting edge cases the model can’t handle correctly. Maybe there’s a specific face that has an unexpected effect on the device. What if you could find a way to cheese a discount out of it or something?
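Boundary-probing like that is easy to picture with a toy stand-in for the detector. This is not the machine's actual model, just an illustration of the idea of sweeping inputs to find where a classifier flips:

```python
# Toy stand-in for a detector: anything with a "face score" above a
# fixed threshold counts as a face. Real models are far more complex,
# but the probing loop is the same idea.
def toy_face_detector(score: float) -> bool:
    return score > 0.6

# Sweep candidate inputs and find the lowest one the detector
# accepts -- i.e. the decision boundary a tinkerer might hunt for.
boundary = min(s / 100 for s in range(101) if toy_face_detector(s / 100))
print(boundary)  # 0.61
```

With a real vision model the "inputs" would be images rather than a single number, and the interesting edge cases live in a much bigger space, but the tinkering loop is the same.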
Imagine a racist vending machine. The face recognition system thinks this customer is black with 81% confidence. Let's increase the price of grape soda! Oh look, a 32 year old white woman (79% confidence). Better raise the price of diet coke!
I don't think they're doing dynamic pricing on an individual basis, that would be too obvious. But checking the demographics of each location or individuals' shopping habits, and potentially adjusting the prices or offerings? Definitely.
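The most mundane version of "adjusting the offerings" per location would just pick a promotion from aggregate stats. All the names and mappings below are made up for illustration; nothing in the article says Invenda actually does this:

```python
# Hypothetical per-machine tweak: choose a promotion based on which
# demographic bucket visits the machine most. Entirely speculative.
PROMOTIONS = {
    "18-25": "energy drinks",
    "26-45": "protein bars",
    "60+": "classic candy",
}

def pick_promotion(visit_counts: dict) -> str:
    """visit_counts maps age bands to visit tallies for one machine."""
    top_band = max(visit_counts, key=visit_counts.get)
    return PROMOTIONS.get(top_band, "default mix")

print(pick_promotion({"18-25": 120, "26-45": 80, "60+": 15}))  # energy drinks
```

Note this only needs aggregate counts, not individual tracking, which is exactly why per-location demographics are the more plausible use than individual dynamic pricing.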
A low end Windows PC can be had very cheap these days. Why bother doing something proprietary, if you can just cobble together something from off the shelf parts?
I'd doubt it's collecting or transmitting much. It's probably just estimating age, sex, race etc. and using it to decide which promotion to put on screen. It's possibly collecting these to determine what type of people use the machine. Similar to those billboards in shopping centres.
Storing each individual to recognize later or identify online seems like a stretch.
If it did have a user bio database, it would be centralised and not on the machine itself.
I think the problem is that it is storing the users' faces at all. If it were simply identifying each person's characteristics, there would be no reason to save that data for later. Also, apparently the company advertises that the machine does transmit this data for estimating age and gender for every purchase.
That's your claim though. They are storing "male, 24" and that's it, no face. Of course they could be lying and actually are storing faces, but it doesn't look like it. And it's also perfectly valid to object to them storing even "male, 24".
The Reddit post sparked an investigation from a fourth-year student named River Stanley, who was writing for a university publication called MathNEWS.
Where Cadillac Fairview was ultimately forced to delete the entire database, Stanley wrote that consequences for collecting similarly sensitive facial recognition data without consent for Invenda clients like Mars remain unclear.
Stanley's report ended with a call for students to demand that the university "bar facial recognition vending machines from campus."
Some students claimed on Reddit that they attempted to cover the vending machine cameras while waiting for the school to respond, using gum or Post-it notes.
The technology acts as a motion sensor that detects faces, so the machine knows when to activate the purchasing interface, never taking or storing images of customers.
It was only after closing a $7 million funding round, including deals with Mars and other major clients like Coca-Cola, that Invenda could push for expansive global growth that seemingly vastly expands its smart vending machines' data collection and surveillance opportunities.
Yeah, but this is the University of Waterloo we're talking about here. This hit Canadian mainstream media (CTV News), so I know that. Also, for a university specializing in Engineering and Mathematics, there's a shit ton of cameras around.
I think this is what many people don't realize. While there are facial recognition regulations, even stricter data regulations like GDPR do allow this technology. Often it is used for marketing data such as gender or age estimation.
A surprisingly large number of devices run Windows. The reason for this is that a lot of people are familiar with the ecosystem and you can get support and service contracts.
At work we have automatic coffee machines that run Windows using a Java app. The area around the office has information and advertising boards that run Windows.
But I have seen a slight shift towards Raspberry Pis running some Linux desktop image.
I’ll play devil’s advocate. The machine recorded estimated age and gender. Assuming it tracked statistics and didn’t store images, what is the real harm? Future candy will have different designs after they found most users were 70-year-old grandpas?
It is anonymized PII data collected without explicit consent, sure, but don’t blow it out of proportion. There is no big surveillance state plot here (yet), just an overzealous marketing team.
If you don't have access to the source code then you don't know what it's doing. If there's economic incentives to take my picture and tie my face to my name then I'm going to assume "trust us, it's anonymous" means "we buy and sell your data" (at least).
If you'll grant that there are people in power who would want a surveillance state, and that businesses routinely sell data to governments, then you don't get to dismiss this out of hand. We have to draw the line somewhere, even if marketing people with a stalker mentality don't see the line.
Not everybody who approaches the machine or walks past it is really consenting to their appearance being logged and analysed though - not to mention that "we don't store data" is only true if the security is effective and no exploits manage to weaponise the camera now staring back at you as you try to make a purchase.
Ultimately vending machines are completely passive sales anyway; the collection of demographic data about who is buying from the machine is a little useless, because it's not like the machine can work on its closing techniques for coin-based candy sales.