Background:
Remember when privacy was a thing? No? Oh, ok then.
But people used to be at least somewhat concerned about having their locations and whereabouts monitored by companies and governments.
I guess this information was used for evil-doing at least once in history? Whatever, who cares, let’s gather data and make an app!
The issue:
Have you ever been blindsided by an in-retrospect-obvious event, like a firing (or even just a passing-over for a promotion) at work, or a seemingly-sudden breakup?
Proposal:
Using the power of OMNIPRESENT CORPORATE SURVEILLANCE, we can create a new program, which we will call Big Brother 2, that does the following:
- Reads all your email (like most email providers already do).
- Reads all your text messages and any transcribed voicemails.
- Examines your online purchasing habits.
- Checks your location history and that of your friends.
- Checks to see if you are associating with any subversive individuals or organizations.
- Analyzes your photos and categorizes their content.
- Monitors your mood by reading your posts on social media.
- Optionally listens in to your conversations, if you are in a place where this is legal.
Big Brother 2 will collect this data from thousands or millions of users, and—using advanced and overhyped machine-learning techniques—it will figure out what kinds of warning signs preceded various life events.
Then it can forewarn you of danger in your own life!
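For illustration, here is a minimal sketch of the core idea in Python. Everything in it is an assumption on my part: the feature names, the fabricated data, and the choice of scikit-learn's LogisticRegression as the advanced and overhyped machine-learning technique.

```python
# Minimal sketch of Big Brother 2's training loop (all names hypothetical).
# Assumes each user-week is summarized as a feature vector, e.g.:
#   [message_frequency, avg_message_sentiment, location_novelty, purchase_anomaly]
# and a label indicating whether a major life event followed within 30 days.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1984)  # seed chosen for thematic reasons

# Fabricated toy data standing in for the omnipresent-surveillance feed.
n_users = 1000
X = rng.normal(size=(n_users, 4))
# Toy labeling rule: low message frequency plus negative sentiment -> "event".
y = ((X[:, 0] + X[:, 1]) < -1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")

# Forewarn a (hypothetical) user whose week of signals looks ominous.
ominous_week = np.array([[-2.0, -1.5, 0.0, 0.3]])
risk = model.predict_proba(ominous_week)[0, 1]
print(f"Probability of an upcoming life event: {risk:.0%}")
```

In a real deployment the labels would come from observed life events in the surveillance feed rather than a toy rule, but the shape of the pipeline (features in, warning probabilities out) would be the same.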
Examples:
- Dating (Figure 1): Two people are dating and their messaging steadily becomes less frequent and more negative. Big Brother 2 can extrapolate their breakup date (a rough sketch of this extrapolation appears after these examples) and (optionally) start preemptively saving flattering photos of those users for their upcoming dating profiles.
- Employment (Figure 2): Someone’s boss mentions “outsourcing” and then communication rapidly drops off. Big Brother 2 can recommend some resume-preparation services for that employee.
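Here is the promised sketch of the breakup-date extrapolation from the dating example. The weekly message counts are fabricated and the linear fit is my own assumption; real relationships presumably decay along more sophisticated curves.

```python
# Rough sketch of breakup-date extrapolation (hypothetical Figure 1 logic).
# Fit a line to weekly message counts and project the week they reach zero.
import numpy as np

weeks = np.arange(10)                                       # observation window
msgs = np.array([52, 50, 47, 41, 38, 30, 27, 22, 18, 15])   # fabricated counts

slope, intercept = np.polyfit(weeks, msgs, 1)               # least-squares line

if slope < 0:
    breakup_week = -intercept / slope   # week where the fitted line hits zero
    print(f"Extrapolated breakup: week {breakup_week:.1f}")
    print("Recommendation: begin archiving flattering photos now.")
else:
    print("Messaging trend stable or improving; no action required.")
```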
Conclusion:
Silicon Valley entrepreneurs: hire me to develop this project. Thanks in advance.
PROS: Could reduce the likelihood of snakebite.
CONS: May result in “Logan’s Run”-esque scenarios where the system determines that a person has negative value, and then the user’s phone starts plotting to murder the user (see historical example from Episode #270 of The Simpsons). If this occurs, it is an example of a bad optimization function, and should be fixed in the next update.
