TL;DR: I get a "confident in his ignorance" vibe from your post and don't feel like defending what is widely agreed upon among those who are actually knowledgeable. I've cited some sources and you can take up any issues with them. Enjoy having the last word, because I don't have any more time for this.
33 bits of uniquely-identifying information are all you need and, back in 2010, they successfully managed to match Netflix users with their accounts on other sites purely based on similarities in their ratings.
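To see why 33 bits is the magic number: 2^33 is about 8.6 billion, which exceeds the world's population, so any combination of attributes carrying roughly 33 bits of entropy is enough to single out one individual. A back-of-the-envelope sketch (the population figure is my rough assumption, not from the cited work):

```python
import math

WORLD_POPULATION = 8_000_000_000  # rough 2020s estimate (assumption)

# Bits of entropy needed to uniquely distinguish every living person.
bits_needed = math.ceil(math.log2(WORLD_POPULATION))
print(bits_needed)  # 33

# Each independent attribute contributes log2(number of possible values)
# bits; e.g. a 5-digit ZIP code alone can carry up to ~16.6 bits.
zip_bits = math.log2(100_000)
print(round(zip_bits, 1))  # 16.6
```

The point is that a handful of individually innocuous attributes add up to those 33 bits faster than intuition suggests.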
I could cite more, but I've got better things to do with my time.
By probability instincts, I mean that a woodpecker's intuitive sense of "What are the odds that I'll find a bug if I keep trying this tree vs. moving on to a new one?" is much more in line with actual reality than our intuitive sense of how probability works. (See, for example, the Gambler's fallacy and the Birthday problem.)
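The birthday problem is a good illustration of how far off human instincts are: with only 23 people in a room, the odds that at least two share a birthday are already over 50%, which most people find implausibly high. A quick sketch of the exact calculation (assuming 365 equally likely birthdays, ignoring leap years):

```python
def shared_birthday_probability(n: int) -> float:
    """Probability that at least two of n people share a birthday,
    assuming 365 equally likely birthdays and ignoring leap years."""
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (365 - i) / 365
    return 1 - p_all_distinct

print(round(shared_birthday_probability(23), 3))  # ~0.507
print(round(shared_birthday_probability(50), 3))  # ~0.970
```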
I'd give a link for the woodpecker part, but I read it in print and I can't remember whether it was in Mean Genes, The Third Chimpanzee, or one of my other heavily-cited books by professors.
*facepalm* I'm honestly not sure how to begin here, given what a fundamental misunderstanding of my argument it represents.
advanced search algorithms and time: With nothing more than the most basic indexing and search that you'd be a moron not to have in your big-data setup, I could easily pull out data I'd hate to see used by people who just want to muddy the waters and introduce reasonable doubt. And I don't consider myself uncommonly smart or knowledgeable.
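To make the "basic indexing and search" point concrete, here is a toy sketch (all records invented for illustration) of how joining an "anonymized" dataset against a public one on a few innocuous quasi-identifiers can collapse a record down to a single candidate, with no advanced algorithms involved:

```python
# Toy illustration (all data invented): re-identification by joining on
# quasi-identifiers. No machine learning, just filtering.
anonymized = [
    {"zip": "90210", "birth_year": 1985, "gender": "F", "diagnosis": "X"},
    {"zip": "10001", "birth_year": 1970, "gender": "M", "diagnosis": "Y"},
]
public_directory = [
    {"name": "Alice", "zip": "90210", "birth_year": 1985, "gender": "F"},
    {"name": "Bob",   "zip": "10001", "birth_year": 1970, "gender": "M"},
    {"name": "Carol", "zip": "90210", "birth_year": 1990, "gender": "F"},
]

def candidates(record, directory):
    """Everyone in the directory matching the record's quasi-identifiers."""
    keys = ("zip", "birth_year", "gender")
    return [p["name"] for p in directory
            if all(p[k] == record[k] for k in keys)]

for rec in anonymized:
    print(rec["diagnosis"], "->", candidates(rec, public_directory))
# Each "anonymous" diagnosis narrows to exactly one named person.
```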
time and security clearance: Or they might just need to be in the right place to exploit a hole in the system. Edward Snowden was: he took advantage of Hawaii being exempt from the NSA's anti-leaker security, an exemption granted because its Internet connectivity to the continental U.S. kept triggering lockouts.
I'd rather not take the risk.
I'm having trouble coaxing my browser history to cough up the most authoritative URLs I read regarding the "correlating for advertising" part, but here are some relevant ones I did manage to dig back up:
* How many mobile phone accounts will be hijacked this summer? (and his earlier post)
* SMSRouter README (See "DISCLAIMER" and "SMS / SMPP limitations")
1. Having my own cloud is orthogonal to whether I divide my services up between different providers. (e.g., I have accounts at more than one VPS provider.)
2. There are cases like Google Talk, which don't federate. I can't self-host those.
3. Some things, like hosting my code on sites like GitHub/BitBucket/GitLab/etc., are specifically about not self-hosting because I want to minimize the chance of downtime after my death.