April 4, 2018
By Maven Road
Social networking has changed the way people communicate, do business, and get their daily news, and it is how many of us spend most of our free time. Almost a quarter of the world’s population, and nearly 80% of internet users in the United States, are on Facebook.
There are several benefits to using social networking. Social platforms allow users to communicate with people they care about, learn new things, develop their interests, and be entertained. In terms of professional development, social media can also be used to deepen knowledge in a particular field and to build a professional network by connecting with other professionals in the same industry. At the company level, social media allows business owners to have a “conversation” with their audience, gain customer feedback, and elevate the brand. All of this comes at no financial cost to the user.
However, there is undoubtedly a “price” to pay in exchange for all these benefits: the significant privacy risks involved in publishing almost our entire lives on a platform. This need to share stems from the human desire to offer valuable and entertaining content to others, to define ourselves, to grow and nurture relationships, and to spread the word about brands and causes we like or support, often without fully understanding the risks this implies.
One example of the inherent danger of exposing our privacy on social networks materialized recently with the UK-based data firm Cambridge Analytica. The company acquired a vast amount of Facebook users’ personal information to develop software intended to identify possible swing voters in political campaigns, including President Donald Trump’s 2016 election bid.
The data this firm managed was reportedly collected in early 2014 by an application called “thisisyourdigitallife”, in which about 270,000 users agreed to have their data gathered and used for academic research in exchange for a small payment. However, the app collected personal information not only from the people who used it, but also from their unknowing friends. Using this data, the app provided information on more than 50 million Facebook users to Cambridge Analytica, which the company then used to create 30 million “psychographic” profiles that could be used to design targeted political ads.
Data is not just inanimate information; behind all the gathered material are individuals with psychological profiles that precede them: preferences, inclinations, values, and motivations.
Private data collected on social media can be very sensitive, from personal health information to corporate information. Even though more data can yield more insight and help in decision making, providing data to and obtaining data from other organizations can introduce security, regulatory, and ethical questions, in addition to putting a company’s reputation seriously at risk.
When personal information is taken and sold or distributed, the analyzed individual is treated as a thing, a means to be used. In the Cambridge Analytica case, using this private information to build a behavior-manipulation system for private purposes was an aggravating circumstance. The question is, how can we avoid this?
Data analysis is a form of business intelligence, applied to make better use of resources and generate profits. In today’s competitive market, businesses use tools like these to create value and a competitive advantage. By using data analytics, organizations can differentiate themselves from their competitors, improve procurement efficiency, and save millions when budgeting.
Infusing data analytics into an organization’s architecture can help stakeholders and key decision makers understand customers’ behavior and identify new and innovative ways to increase sales by uncovering trends, patterns, and useful information in existing data sets.
For all the wonders the data analysis industry offers, there is also a dark side to this business. Although this is an uncomfortable topic for companies that engage in the practice, after the Facebook-Cambridge Analytica scandal it is time to have an honest conversation about the self-regulation we must define in our business.
Knowing that data represents flesh-and-blood individuals, it is important to take measures that respect their identities and the personal information they choose to share online. As a growing industry, we need to come together on ideas and agreements that guarantee the safety of the users behind the data, so we can continue to offer our essential assistance to organizations in a more responsible way.
At Maven Road, ethics is a priority. We understand the importance of not betraying the trust of users who share their private information on social platforms. Consistent with this business value, our company follows measures that protect the privacy of the profiles we analyze: anonymizing data, restricting access to it, managing it responsibly, and encrypting sensitive information.
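As an illustration of what such safeguards can look like in practice, the sketch below shows one possible way to replace user identifiers with salted one-way hashes and to encrypt sensitive free-text fields. The field names, the use of the cryptography library, and the in-memory key are illustrative assumptions for this example, not a description of our production pipeline.

```python
import hashlib
import os
from cryptography.fernet import Fernet  # symmetric encryption for sensitive fields

# A random salt kept separate from the data set; without it, the hashed
# identifiers cannot easily be linked back to the original accounts.
SALT = os.urandom(16)

def anonymize_user_id(user_id: str) -> str:
    """Replace a platform user ID with a salted, one-way hash."""
    return hashlib.sha256(SALT + user_id.encode("utf-8")).hexdigest()

# In practice the key would live in a secrets manager; generating it
# inline keeps this sketch self-contained.
cipher = Fernet(Fernet.generate_key())

def protect_record(record: dict) -> dict:
    """Anonymize the identifier and encrypt free text, keeping only coarse attributes."""
    return {
        "user": anonymize_user_id(record["user_id"]),
        "text": cipher.encrypt(record["text"].encode("utf-8")),
        "age_group": record["age_group"],  # coarse variables kept for aggregate reporting
        "gender": record["gender"],
        "nationality": record["nationality"],
    }

if __name__ == "__main__":
    sample = {
        "user_id": "user_12345",
        "text": "a publicly shared post",
        "age_group": "25-34",
        "gender": "F",
        "nationality": "US",
    }
    print(protect_record(sample))
```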
In our company, profiles first go through a process of data selection and cleansing; we then anonymize the data to ensure the privacy of profile owners. Next, themes and archetypes are defined and the data is categorized using only the information that users share publicly. Finally, we release only aggregate data, describing particular groups based on specific variables such as age, gender, or nationality.
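To give a sense of that final aggregation step, the following sketch counts anonymized records by coarse demographic variables and suppresses any group below a minimum size, so no individual can be singled out in the published figures. The threshold and field names are illustrative assumptions, not our actual parameters.

```python
from collections import Counter

MIN_GROUP_SIZE = 10  # illustrative suppression threshold, not a real policy value

def aggregate(records: list[dict]) -> dict:
    """Count records per (age group, gender, nationality) and drop small groups."""
    counts = Counter(
        (r["age_group"], r["gender"], r["nationality"]) for r in records
    )
    return {group: n for group, n in counts.items() if n >= MIN_GROUP_SIZE}

# Only groups with at least MIN_GROUP_SIZE members appear in the published report.
report = aggregate([
    {"age_group": "25-34", "gender": "F", "nationality": "US"},
    # ... more anonymized records ...
])
```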
We call on the industry to exercise responsibility when analyzing data. It is imperative to open a channel of dialogue among us and to define a set of rules, applicable to all equally, that will help us improve our working methods and ensure the responsible treatment of the information we gather. As more companies talk about these security issues, more and better ideas will emerge for the benefit of all.