It won’t work alone for long in the age of AI. You won’t be tracked and identified by face alone. It’ll be a complex array of data points. Your face, your hair, your eye color if the cameras have the resolution, your height, your gait, your posture, your scars and injuries, your visible birth defects, whether you use mobility aids, the wireless devices emitting signals in your pockets, the list goes on and on. They’ll assemble dozens of data points and make it extremely difficult to falsify enough to avoid detection instead of just getting flagged as suspicious.
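To put rough numbers on that idea, here's a toy sketch (plain Python; the feature names, weights, and thresholds are all made up for illustration) of how a pile of weak signals could be fused into one identification score, with hidden or unreadable features themselves counting as an evasion signal:

```python
# Toy sketch: fusing many weak identity signals into one score.
# Feature names, weights, and thresholds are invented for illustration.

WEIGHTS = {"face": 0.30, "gait": 0.20, "height": 0.10,
           "posture": 0.10, "device_ids": 0.20, "clothing": 0.10}

def identity_score(observations):
    """observations maps feature -> match confidence in [0, 1],
    or None if the feature was missing/unreadable (e.g. masked face)."""
    score, missing = 0.0, 0.0
    for feature, weight in WEIGHTS.items():
        conf = observations.get(feature)
        if conf is None:
            missing += weight      # absent signals are themselves a signal
        else:
            score += weight * conf
    return score, missing

def classify(observations, match_at=0.7, suspicious_at=0.4):
    score, missing = identity_score(observations)
    if score >= match_at:
        return "identified"
    if missing >= suspicious_at:   # too many hidden/falsified features
        return "flagged: possible evasion"
    return "no match"
```

The point of the sketch: hiding most of your features doesn't get you to "no match", it gets you to "flagged", which is exactly the falsification-gets-you-noticed problem.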
Even if you go fully privacy-focused with a locked-down phone and a dumb car, Flock cameras are tracking your plate, scanners are reading the tire-pressure sensors in your wheels, and your phone has to identify itself by IMEI to every tower it connects to, so all of that is logged.
Will they actually devote the resources to try to pierce the anonymity of that handful of people? Everything we’ve seen about how tech companies operate suggests they reach a threshold of “good enough for most cases” and don’t bother optimizing the edge cases. Collecting billions of data points, running dozens of analysis techniques on them, and then layering on some kind of meta-analysis to resolve disagreements between models would be resource-intensive beyond their own profit motives.
Someone who wants to defeat gait analysis with a different pair of shoes (heel height, sole thickness, and support all affect how people walk) and wears a mask might still lose the arms race if the tech companies choose to keep improving the tech even after it’s already good enough.
I think it’s possible but not inevitable. Especially if there’s a financial reckoning for AI companies soon.
Will they actually devote the resources to try to pierce the anonymity of that handful of people?
Given how rapidly they’ve been trying to tighten their grip lately? Yes. They want to know everything about everyone all the time. And they don’t give two shits if we consent to it.
Even if businesses are willing to settle for good enough, governments most certainly will NOT. Those attempting to evade detection will be those they’re most interested in identifying, which is why I mentioned that failure to successfully falsify will get you flagged as having attempted it and probably how. From a government’s perspective, the ones attempting to evade detection are the ones most likely to be criminals or, even worse in their eyes, rebels. Governments, especially authoritarian ones, will make sure the tech constantly pushes the boundaries of what’s possible, or at the very least defeats the vast majority of known evasion techniques.
Then, if business really has left the evaders unidentified, they’ll start adopting the tech from government. Better data with no R&D? Why wouldn’t they at that point? Governments might even subsidize it because it helps them spread the greater surveillance network.
Once we have a case where AI-fabricated evidence is used to convict somebody, it will also get much easier to dismiss a lot of this data. I still get ads for dog food for a dog I don’t own.
I know corporations can handsomely reward unscrupulous scientists but more often than not they do not effectively use the data they’re amassing.
That won’t stop corporations and governments from surveilling. They’ll still collect highly accurate information about you. They may not trust public data, but they’ll still trust the systems they use to surveil. They’ll still be right.
The clothing they wear solves most of those. The physical side (gait, height, etc.) is a little harder, but shoes affect most of it (e.g. the round-bottomed shoes meant to help people work out just by walking wildly change a normal gait and posture).
For the devices though it could get fun. You could have a device mounted in the helmet that will pretend to be people you’ve passed, essentially just replaying the beacons (SSID broadcasts, etc) for the sake of a digital camouflage.
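That beacon-replay idea could be sketched like this (toy Python; the capture/replay structure is entirely hypothetical, and real radio I/O would need monitor-mode hardware and raw frame injection, which this deliberately leaves out):

```python
import random
from collections import deque

# Toy sketch of the "digital camouflage" idea: remember the wireless
# beacons (SSID/MAC pairs) seen recently, then rebroadcast a shuffled
# subset so your radio footprint looks like the crowd you walked through.
# capture() and pick_replays() are stand-ins, not a real radio API.

class BeaconCamouflage:
    def __init__(self, memory=50):
        # Bounded memory: old observations fall off as new ones arrive,
        # so you only ever mimic people you've recently passed.
        self.seen = deque(maxlen=memory)

    def capture(self, ssid, mac):
        """Record one observed beacon (e.g. from a passing phone)."""
        self.seen.append((ssid, mac))

    def pick_replays(self, n=5):
        """Choose up to n previously seen beacons to rebroadcast,
        shuffled so replay order doesn't mirror capture order."""
        pool = list(self.seen)
        random.shuffle(pool)
        return pool[:n]
```

The design choice that matters is the shuffle plus bounded memory: replaying beacons in the exact order and timing you heard them would itself be a recognizable signature.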
For now. This cat-and-mouse game will continue on and on. We’ll develop evasion techniques, they’ll learn to recognize and see through them. We’ll develop new ones, they’ll learn those too. What if you speak near a camera? It’ll learn to analyze voice and diction. Voice scrambler? AI is learning to descramble video, and can probably learn it for voice, too. Your clothing style will become a data point, and an expensive one to consistently falsify. The locations you’re seen at are suggestive. If you walk a dog, good fucking luck convincing it to help you falsify data for the AI monitors.
Still, this is predicated on the assumption that you can recognize and falsify enough of the data points. My point is that they will collect however many data points it takes to make a failed identification or a false positive nigh impossible. And if it’s a false positive, we have to question the ethics of pinning your trail on some other random dude.
I agree for the most part, but if we are all walking around in jumpsuits and helmets (Daft Punk style) and repeating the digital beacons of everyone else, it seems like false positives are a skill issue for the AI. Not too long ago I watched a video about a guy who, in the eyes of the AI, was a 100% match for someone who had been trespassed from a casino. When the cops showed up and he presented his documents, they brought him to the station because they assumed he must have given a false ID when he was originally trespassed. He was eventually able to prove his innocence, but the fact that he was taken into custody because the AI messed up means I have no issue with people intentionally poisoning the data.
None of this matters in the present context though because just by wearing that you would be easily identifiable.
You can’t perfectly mimic everyone else and accomplish something unique at the same time, even if it’s something as simple as pulling up a webpage nobody else around is requesting. Your device must in some way identify itself to the network so it can actually receive what it requests, and that’s an avenue for identification and tracking.
Not too long ago I watched a video about a guy who, in the eyes of the AI, was a 100% match for someone who had been trespassed from a casino. When the cops showed up and he presented his documents, they brought him to the station because they assumed he must have given a false ID when he was originally trespassed.
Sure, modern AI can’t push the limits like I’m talking about, but I’m not talking about doing all this with modern AI as it is now, and things are advancing extremely rapidly. Processing power available is, too, as companies churn out as many new data centers as they can. It might not be as long as we hope before the things I suggest become feasible.
He was eventually able to prove his innocence, but the fact that he was taken into custody because the AI messed up means I have no issue with people intentionally poisoning the data.
Yeah, modern AI is trained unethically at just about every step of the process, so poison away.
Hope so. Personally I’m pulling for a modular jumpsuit as well. Most of it will come as one piece, so no more worrying about pants and tops as separate items which have to be coordinated and can get lost. Socks/gloves/masks etc would be optional but will just snap onto the main bodysuit piece. Everything will be interchangeable, standardized and spares can be purchased anywhere. Different weights for different climates etc, but it’s all one platform, like the AR-15 of clothing.
Please vote for me in the upcoming primary. If I win I’m sending everyone a free suit. ~(Which you’ll need, because they’ll be mandatory)~ VOTE FOR ME
I think in the future everyone will wear masks like daft punk
Which is already illegal in much of the US. Will it hold up in court? Maybe not, but that doesn’t stop cops from demanding you take it off.
Makeup isn’t, though: we all just need to get down with the clown.
Brb, investing heavily in National Beverage Corp (FIZZ).
Will they actually devote the resources to try to pierce the anonymity of that handful of people?

Well, they are building data centers like they want to have enough resources to do just that.
I think in the future everyone will wear masks like daft punk

That would at least be fun, if not a bit uncomfortable long term.