So, why was GDPR introduced?
Prior to GDPR, data-protection laws were written for a world without smartphones and services that collect massive amounts of sensitive information for companies such as Google and Facebook. GDPR gives companies clear guidelines on how they may use personal data, while giving users clarity on how their data is being handled.
Legislators in the United States are working on similar regulation while monitoring GDPR’s effects. No matter where you are located, however, GDPR affects companies and users everywhere: although it is law only in the EU, it has become a de facto global standard.
But, what exactly is personal data under GDPR?
GDPR was designed to protect the data of European users, but because the “cloud” is not confined to one computer and software services have a global reach, GDPR covers EU users’ data even when that data is processed internationally. Any business hosting personally identifiable information (PII) – any data that can identify you, such as your name, email address, social security number, picture, phone number, username, location, or internet protocol (IP) address – falls under GDPR’s supervision.
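For teams that handle records containing fields like these, the idea can be sketched in a few lines of Python. This is a minimal illustration, not a compliance tool: the field names and the masking strategy are assumptions for the example, and real compliance work requires a proper data inventory rather than a hard-coded list.

```python
# Sketch: masking common PII fields before a record is logged or analyzed.
# The field list mirrors the examples above (name, email, phone, IP, etc.).

PII_FIELDS = {"name", "email", "ssn", "phone", "username", "location", "ip_address"}

def redact(record: dict) -> dict:
    """Return a copy of `record` with known PII fields masked."""
    return {k: ("<redacted>" if k in PII_FIELDS else v) for k, v in record.items()}

user = {"email": "jane@example.com", "plan": "pro", "ip_address": "203.0.113.7"}
print(redact(user))
# {'email': '<redacted>', 'plan': 'pro', 'ip_address': '<redacted>'}
```

Non-PII fields (here, the hypothetical `plan`) pass through untouched, which is the point: regulation targets data that can identify a person, not data in general.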
Well, how did the US react?
Similar to the GDPR, California passed the California Consumer Privacy Act (CCPA) of 2018 – which will go into effect on January 1, 2020 – affecting how personal data is collected, processed, and shared in California.
The CCPA was designed with three major themes: ownership, control, and security.
- Ownership gives users the right to know what personal information is being collected, and whether that personally identifiable information is being sold or disclosed, and to whom.
- Control gives users the right to say no to the sale of personal information and the right to equal service or price; if you opt out of a sale, you will not be penalized. If the principle of control sounds familiar, it’s because the Federal Communications Commission (FCC) put rules in place to prevent internet service providers (ISPs) from selling your data without obtaining an opt-in – rules Congress later repealed. The CCPA reinstates this protection at the state level, requiring the ISP to ask you before it can sell or market your personal information.
- To uphold security, a business that suffers a breach can face statutory damages of $100 to $750 per consumer per incident, plus civil penalties of up to $7,500 per intentional violation. Although this isn’t as strict as GDPR’s fines, it’s more than just a slap on the wrist.
Even though the CCPA applies in only one state right now, it may be the most impactful start toward a GDPR-like act in the US.
Ultimately, where are the ethical lines?
When data is used in ways that benefit others while adversely affecting you, ethical problems will arise. Complying with changing privacy regulations is stressful for companies, as well as a drain on resources, but many are embracing it as an opportunity to increase trust and transparency.
As we enter the age of machine learning, artificial intelligence, and facial recognition, much of your data profile stems from your social network activity. When it comes to our data, many Americans see this as a black-and-white issue: an overwhelming 63 percent believe that social media platforms have far too much power.
But, how can data collection be immoral when it serves as the backbone of so many of these services we use every day? How many helpful job recommendations have been given by software that matches job seekers’ skills and attributes? How many human connections have been built through recommendations on social media platforms such as Facebook or LinkedIn?
Social media activity, particularly on Facebook and Twitter, has been found to reflect people’s personality and intelligence, as well as characteristics such as sexual orientation and political views. So can it be ethical to mine this data for hiring purposes, when users typically joined these platforms with a different intent – and therefore never consented to data analytics drawing conclusions from their posts?
Federal legislation was recently introduced – the Algorithmic Accountability Act of 2019 – which intends to prevent inaccuracies, bias, and discrimination in automated decisions, particularly in the hiring process. So, as the adage goes, “with great power comes great responsibility.” Data and its collection are not the issue; the improper use of it is.