
By Joshua Anderson
From Facebook data breach lawsuits to the enactment of the General Data Protection Regulation (GDPR), data privacy has become an increasingly prominent topic in political discourse around the world. The growing issue has exposed how little the general public understands about big data. Recent data privacy controversies have also revealed the younger generation's complacency toward, and disregard for, data privacy in an age of largely unregulated technology.
TikTok, a short-form video social media app, recently entered the spotlight of the data privacy discussion. On August 14, 2020, President Trump issued an executive order requiring ByteDance, TikTok's Beijing-based parent company, to cease operations within the United States unless it sells TikTok to a U.S. entity by November 12. Regardless of whether this move by the Trump administration was an essential step or a ludicrous misstep, it is yet another major flashpoint in the world of data privacy.
The Trump administration, along with many in Congress, has accused TikTok of sharing user data with the Chinese government, which would violate the privacy and security of U.S. citizens. The accusations stem from Beijing's legal authority to compel ByteDance to hand over user data; ByteDance has denied all such allegations. Amid this conflict, the Chinese government amended its export control rules for the first time since 2008. The change covers 25 categories of goods, including artificial intelligence assets such as TikTok's user-interest algorithm, and it would significantly complicate any sale of TikTok to a U.S. firm.
In response to these complications, a preliminary deal with Oracle and Walmart was confirmed on September 19. Since the deal is not yet finalized, TikTok is still at risk of being banned. On September 27, a federal judge blocked the restrictions from taking effect for the app's U.S. user base of more than 100 million people until a full court hearing.
This is not the first time a camera-based app has raised international data privacy concerns in the U.S. In 2019, the F.B.I. investigated FaceApp, a popular photo filter app created by the Russia-based company Wireless Lab, over suspicions that user images were being sent to the Russian government. The suspicions stemmed from a clause in FaceApp's terms and conditions establishing its right to modify, reproduce, and publish any image users processed through its artificial intelligence algorithm. The case set a precedent that tech companies can be used by potentially hostile foreign nations to spy on and gather information from U.S. citizens.
Big Data
Big data is defined as "an accumulation of data that is too large and complex for processing by traditional database management tools." Companies expend tremendous resources to obtain as much data as they can about their users. TikTok in particular holds massive amounts of video data posted by its users, in addition to implicit data such as user location and age.
Why would companies pay so much attention to large and overly complex sets of data?
In contrast to the general public, technologists and business owners understand that data has not only immense monetary value but also informational value. To them, data is unprocessed information that can be refined into compelling insights. Because big data is too complex for any one person to extract insights from by traditional means, tools like artificial intelligence – which can surface predictive patterns and causal relationships – have become increasingly popular. The insights hidden in big data are now accessible to anyone with a computer and the time to learn Python on YouTube.
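To illustrate how low that barrier really is, here is a minimal Python sketch of the kind of analysis anyone could run on exported user data. The file name and column names (user_id, age, topic, seconds_watched) are hypothetical placeholders, not taken from any real TikTok export.

```python
# A minimal sketch of extracting insight from raw user data with Python.
# The file and column names below are hypothetical placeholders.
import pandas as pd

# Hypothetical event log: one row per video view.
events = pd.read_csv("user_events.csv")  # user_id, age, topic, seconds_watched

# Bucket viewers by age, then see which topics hold each group's attention longest.
events["age_group"] = pd.cut(events["age"], bins=[12, 17, 24, 34, 54, 99])
engagement = (
    events.groupby(["age_group", "topic"])["seconds_watched"]
          .mean()
          .sort_values(ascending=False)
)
print(engagement.head(10))
```

A few lines of aggregation like this are already enough to surface the behavioral patterns that advertisers and recommendation systems prize.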
With advancements in technology – particularly data science – the power of image processing has increased exponentially. Images contain massive amounts of information that, with new technologies such as Convolutional Neural Networks (CNNs) and advancing computer hardware, can yield unprecedented results. At Chapman University, Dr. Erik Linstead, Associate Dean of Fowler Engineering, and members of the Machine Learning Assistive Technology lab (MLAT) recently published a study, "A Deep Learning Approach to Identifying Source Code in Images and Video," that uses this technology to analyze thousands of video images. Their goal was to identify whether any given image or video contained code – specifically Java – whether handwritten, typed, or only partially visible. The models used achieved 85.7 to 98.7 percent accuracy in their classifications.
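The study's actual models are not reproduced here, but a minimal sketch of the general technique – a small convolutional network that scores whether a single video frame contains code – might look like the following. It assumes the PyTorch library; the architecture and names are purely illustrative.

```python
# A minimal, illustrative CNN for binary frame classification ("code" vs. "no code").
# This is NOT the architecture from the Chapman study; it only sketches the technique.
import torch
from torch import nn

code_detector = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # low-level visual features
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # higher-level, text-like patterns
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),                                    # collapse spatial dimensions
    nn.Linear(64, 1),                                                         # one logit per frame
)

# A single fake 224x224 RGB frame, just to show the forward pass runs.
frame = torch.rand(1, 3, 224, 224)
p_code = torch.sigmoid(code_detector(frame))  # probability the frame shows source code
print(f"Estimated probability of code in frame: {p_code.item():.2f}")
```

Trained on a labeled set of frames, a network like this learns which visual textures distinguish source code from everything else in a video.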
Turning to the content on TikTok itself, technologies such as CNNs can be used to classify objects in videos, from the tie around someone's neck to the model of a car on the street. Given enough videos of a particular location, person, or item, specific questions about it can be answered. And that is without even accounting for the IP addresses, geolocation-related data, browsing history, and other user data that TikTok's terms and conditions state it automatically collects.
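No custom model is even required for that kind of object recognition; off-the-shelf pretrained networks already do it. The sketch below, which assumes the torchvision library and a hypothetical locally saved video frame named frame.jpg, prints the labels a pretrained CNN assigns to the objects it sees in that frame.

```python
# A minimal sketch of object recognition on a single video frame using a
# pretrained CNN from torchvision. "frame.jpg" is a hypothetical local file.
import torch
from PIL import Image
from torchvision.models import resnet50, ResNet50_Weights

weights = ResNet50_Weights.DEFAULT
model = resnet50(weights=weights).eval()   # CNN pretrained on ImageNet
preprocess = weights.transforms()          # matching resize/normalize pipeline

frame = Image.open("frame.jpg").convert("RGB")
with torch.no_grad():
    logits = model(preprocess(frame).unsqueeze(0))

# Top five predicted labels; ImageNet categories include everyday objects
# such as neckties and specific vehicle types.
probs = logits.softmax(dim=1)[0]
top = probs.topk(5)
for p, idx in zip(top.values, top.indices):
    print(f"{weights.meta['categories'][int(idx)]}: {p:.2%}")
```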
Privacy
The general sentiment around the banning of TikTok – especially among the younger population – is something along the lines of: "They have all my data anyways, so why does it matter?" That indifference can be dangerous. Beyond the national security risks described above, waning concern for personal privacy has historically led to a weaker defense of privacy in how the courts interpret our legislation.
Katz v. United States (1967) was a pivotal decision that established the main framework for understanding privacy in the modern era. In the majority opinion, Justice Potter Stewart wrote:
“For the Fourth Amendment protects people, not places. What a person knowingly exposes to the public, even in his own home or office, is not a subject of Fourth Amendment protection. But what he seeks to preserve as private, even in an area accessible to the public, may be constitutionally protected.”
Not long after, Smith v. Maryland (1979) ruled that telephone companies could record the phone numbers a customer dialed, rejecting the idea of a "legitimate expectation of privacy" in those numbers: "We doubt that people in general entertain any actual expectation of privacy in the numbers they dial." The ruling shows that the country's culture and expectations around data privacy influence how we are governed.
In response to the increasing value of private data in the 21st century, we have seen major advancements in legislation like GDPR, which in turn inspired the California Consumer Privacy Act (CCPA). These laws place substantial restrictions on what a company can do with a customer's data without consent. While people may disagree on whether these regulations are productive or counterproductive, the support for legislation such as GDPR shows a collective desire for privacy over personal data.
Our expectation of privacy shapes the laws that govern us, and data is put to far more questionable uses than the average person realizes. TikTok is only one chapter in the story of modern data regulation. The rising generation's posture toward data privacy – whether indifference or acknowledgment – will matter ever more in the biggest decisions in data regulation yet to come.
Joshua Anderson is a first-year graduate student at Chapman University studying Computational and Data Sciences. He is a technology columnist for The Hesperian.