“What’s that?” I asked, looking at the colorful glass containers my wife was holding.
“Wasp traps,” she said. Then, noting the vacant look on my face, she explained: you add sugar water and hang them outside; wasps fly in, get caught and die.
That’s an imperfect segue to the recent technology buzz over Facebook and information. (Except for that “die” part.)
There’s some irony in people getting worked up about their information being shared, because we are so bad at protecting it ourselves. More than one in four people don’t lock their data-packed phones. We install software without reading the Terms of Service and apps without editing what information we share. We store passwords in our browsers and reuse them on multiple sites. And Millennials, though they tend to be privacy conscious, are more likely than others to trade access to their location and information for concrete benefits.
In this case, the information at stake isn’t private data like Social Security numbers or the name of your first pet. It’s your preferences.
Detailed demographic information is already freely available by ZIP code from the U.S. Census and other government agencies (usa.gov/statistics). Religious profiles can be generated from the same kind of data (thearda.com/DemographicMap).
Go one step further and you have lifestyle segmentation. Companies like Claritas use consumer behavior to generate lifestyle and media traits along with demographics (segmentationsolutions.nielsen.com/mybestsegments). Now you know not only an area’s average family size, but that its residents are likely to shop at Pottery Barn. Churches can use this type of information to develop more effective outreach strategies or decide where to plant new churches.
We don’t like to admit it, but knowing our preferences allows savvy data-crunchers to infer other things about us. Scoring people on the “Big Five” personality traits (extraversion, agreeableness, conscientiousness, neuroticism and openness to new experiences) produces a psychographic profile. Demographics are based on general data; this is individualized.
Research from the University of Cambridge found that if you’ve liked 10 Facebook pages, an advertiser can know you as well as a colleague. With 70 likes, a good friend. 150 likes? Your parents. And with 300 Facebook likes, about as well as your spouse.
It’s bad enough that this Facebook snafu shared your preferences with advertisers because a friend (not you) took an online survey, which, without your knowledge or permission, gave advertisers access to that friend’s friends’ data, i.e., yours. And Congress repealed FCC privacy rules last year, so your browsing history can be sold to advertisers without your consent.
Even knowing all this, you may share my initial reaction: what do I care if some advertiser knows I like Monty Python? Here’s why: manipulation. Like those wasps, you’ll be lured by feeds designed to attract you, fed false information, and possibly targeted with false messages crafted for you to respond to. Reality becomes fuzzier. Stings, doesn’t it?
Maybe you can’t or don’t want to leave Facebook, and advertisers already have your information. Use tools that show what social networks know about you (tech.co/how-facebook-know-2018-03). Protect your privacy. Take charge of what you see on your feed with MIT Media Lab’s GoBo (gobo.social) or block your feed with feedless.me (iOS). But especially, suggests Forbes writer Dani Di Placido, challenge your perception, and replace blind agreement with understanding.
Ken Satterfield is a former media specialist and current marketing coordinator for Word&Way.
What Churches Can Learn From Facebook Data Controversy (ChurchMag)
Download Your Google and Facebook Information: What You’ll Learn (both from USAToday)
The Truth About Teenagers, the Internet, and Privacy (Fast Company)
How Cambridge Analytica Mined Data for Voter Influence (Psychology Today)
How Facebook Made it Easy to Hack (Our) Minds (Eudaimonia)
How Researchers Learned to Use Facebook ‘Likes’ to Sway Your Thinking (New York Times)
How to Manipulate Facebook and Twitter Instead of Letting Them Manipulate You (MIT Technology Review)