It is safe to say that a good many of us create digital personalities that are representative of our real selves. Sure, we may embellish. We may highlight the good and limit the bad (at the Propared offices, Eric thinks he is much funnier than he is). And there are certainly those who use the veil of social media to hide who they really are. But in general, we are pretty recognizable. We read stories that interest us, we comment when we are inspired, we "like" and "retweet" relevant and interesting content.
So even if we are misrepresenting ourselves to each other, we are certainly exposed to the companies whose services we use - Facebook, Twitter, LinkedIn, etc. They see everything. More specifically, and to the point of today's blog, they are developing mechanisms that allow them to intuit everything as well.
Let me explain. Recently, Facebook revealed that it had manipulated hundreds of thousands of users' news feeds to determine whether the content users saw would affect their emotional states and, ultimately, the tone of those users' future posts. Translation - does a feed skewed heavily negative make you post more negatively? And vice versa? The answer, according to Facebook's data scientists, is a resounding YES. Third parties are even building tools to map users' emotional responses using "keywords" designed to capture the reactions around particular events.
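To make that idea concrete, here is a minimal sketch (in Python, with entirely hypothetical keyword lists) of how a keyword-based tool might score the emotional tone of posts around an event. Real tools use far larger lexicons and statistical models; the point is only the basic principle of counting emotionally loaded words.

```python
# Minimal sketch of keyword-based emotion scoring.
# The keyword lists below are hypothetical; real tools use much
# larger lexicons and trained models rather than simple counting.

POSITIVE = {"love", "great", "happy", "excited", "wonderful"}
NEGATIVE = {"hate", "awful", "sad", "angry", "terrible"}

def emotion_score(post: str) -> int:
    """Return a crude tone score: +1 per positive keyword, -1 per negative."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts_about_event = [
    "I love this show, what a great night!",
    "Awful traffic, angry and sad before the doors even opened.",
]

for post in posts_about_event:
    print(emotion_score(post), post)
```

Aggregate enough of these crude scores across thousands of posts about the same event, and you get exactly the kind of emotional map described above.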
Emotional contagion is nothing new. It has been documented in countless lab experiments, but the idea that our emotional states can be manipulated through our social media profiles definitely raises new questions, especially if the only consent we have given is buried in the dregs of a Data Use and Privacy Policy.
What if that information is shared with other companies? How would they use such data? The first thing that comes to mind is targeted marketing. But open a browser window today and we already see ads, offers, banners, panels, and who knows what else enticing us to engage with products - ads carefully selected to connect with us based on the social information we share. Those ads, though, are generated from our interests, buying history, "likes," "shares," websites frequented, and other hard data. That information may have an emotional component for us (going online to buy a birthday gift for a family member, for example), but the ads themselves are generated after the fact. This new research suggests companies could influence our behavior before we even go looking to fulfill such needs. That ventures into dangerous territory - consumer protection, informed consent…
Our CEO, Ryan Kirk, wrote a piece last week about Propared's values. One of those, transparency, crops up again here. In the many discussions about the "look" and "feel" of Propared, we ultimately decided to leave the marketing material outside of the program itself. We believe that if it truly is our mission to make work easier, more efficient, and more cost-effective for live event professionals, we must not interfere with their ability to get their work done. But outside of the application, of course we market. We market because we don't know everyone yet. We market because we believe that with your input and feedback, Propared will help you become a more effective manager.
We hope that excites you. We hope that it engages you and inspires a positive reaction to us and what we have to offer. But no, we won't be mapping your emotional state anytime soon.