cross-posted from: https://sh.itjust.works/post/57126527
How to use data poisoning to trick the algorithm that’s profiling you (and why “personalization” is more fragile than you think)
Note: For education and defensive awareness only. I’m explaining the concept of data poisoning so teams can recognize risks and build safer systems. I’m not encouraging or providing guidance for misuse. :)
If you’re being tracked, scored, and predicted from your clicks… this is how the machine actually works (and how it breaks).
If a retailer can guess you’re pregnant before your family knows… imagine what ad platforms and recommendation feeds can infer about your money, your health, and your next life move from boring little signals you barely notice.
I’m Addie. I’ve spent 15 years in cybersecurity, and I teach people to spot cyber threats before they blindside you. In this video, I break down the real mechanics behind prediction engines, why “scale” doesn’t protect models from manipulation, and how tiny amounts of poison in training data (or your own behavior) can make these systems confidently wrong.
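To make the "tiny amounts of poison" claim concrete, here's a minimal toy sketch (not from the video — everything below, including the cluster positions and the 1-nearest-neighbour "profiler," is an illustrative assumption). An attacker injects 3 mislabelled points, well under 1% of the training set, right next to one target query. The model's overall accuracy barely moves, but its prediction for that one target flips — which is exactly why a targeted poisoning or backdoor can hide inside a large dataset without "breaking" accuracy:

```python
import random

random.seed(7)

def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def knn_predict(train, point):
    # 1-nearest-neighbour: copy the label of the closest training point.
    return min(train, key=lambda item: dist2(item[0], point))[1]

def make_cluster(center, label, n):
    # Two well-separated blobs of "behavioral signals" (hypothetical data).
    return [((center + random.gauss(0, 0.8), center + random.gauss(0, 0.8)), label)
            for _ in range(n)]

train = make_cluster(0.0, 0, 200) + make_cluster(5.0, 1, 200)
test = make_cluster(0.0, 0, 200) + make_cluster(5.0, 1, 200)

def accuracy(model_train):
    return sum(knn_predict(model_train, p) == y for p, y in test) / len(test)

# A "victim" query deep inside class-0 territory.
target = (0.3, 0.3)

# Targeted poisoning: inject a handful of points with the attacker's chosen
# label right on top of the target -- 3 points out of 403 training examples.
poison = [((target[0] + dx, target[1] + dy), 1)
          for dx, dy in [(0.001, 0.0), (0.0, 0.001), (-0.001, 0.0)]]
poisoned_train = train + poison

print("clean prediction for target:   ", knn_predict(train, target))
print("poisoned prediction for target:", knn_predict(poisoned_train, target))
print("clean accuracy:   ", accuracy(train))
print("poisoned accuracy:", accuracy(poisoned_train))
```

Running this, the poisoned model still scores essentially the same accuracy on the held-out test set, so an evaluation dashboard would show nothing wrong — yet the target query now gets the attacker's label. Real recommendation and ad systems are vastly more complex, but the failure mode scales: aggregate metrics don't detect targeted manipulation.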
Here’s what you’ll be able to do after this:
- Understand how behavioral profiling and predictive analytics pull “private truths” from normal shopping and scrolling
- Spot how personalized ads and recommendation systems build a story about you from clicks, watch time, and purchases
- Learn what data poisoning means (in plain English) and why it works at web scale
- See how an AI backdoor attack can hide in massive training sets without “breaking” accuracy
- Recognize why adtech and real-time bidding are fragile when signals get polluted by bots and noise
- Understand model collapse and what happens when AI training data becomes AI-generated sludge
- Start testing feedback loops safely so you can build hacking instincts without doing anything reckless

Sources:
Could you just stop reposting this fucking useless video? 🙄
The next video is how to keep flies from landing in your mouth. It’s going to be a banger, because I guess YouTube people have a real problem with it.