Students from low-income families are most likely to attend schools that do not systematically research educational technology, putting their data privacy at greater risk, a new analysis finds.
An analysis by Internet Safety Labs, a nonprofit group that researches technology product safety and privacy, found that the apps used by these schools are the most likely to contain advertising.
The same pattern appeared in schools serving majority American Indian/Alaska Native student populations.
That is despite the fact that these schools recommend or require students to use fewer apps, on average, than their wealthier counterparts.
Schools serving low-income students are three times more likely to recommend or require apps containing behavioral advertising than schools serving students from families earning more than $150,000 per year.
“Even though they were recommending fewer technologies, they were getting the technologies that were behaving the most riskily, and it was very disappointing to see that in the data,” said Lisa LeVasseur, executive director of Internet Safety Labs.
The analysis also found that schools with predominantly Black students were the most likely to have ads and trackers on their websites.
Together, these findings point to a potentially worrying level of data collection on students from minority and low-income families, LeVasseur said.
LeVasseur said that, on its own, an educational app collecting data about students may seem innocuous. The larger concern is that data gathered across the various technologies students use at school and in their personal lives can be transmitted to third-party data brokers.
Data brokers compile detailed profiles of the people who use these technologies and sell those profiles to third parties. People have no control over who buys their data or how it is used, LeVasseur said.
These profiles may include sensitive information such as a user's religion and gender, their location and movements, and details about their physical and mental health, she said.
Creators of software programs and apps used in schools often point out that user agreements for their technology are clear about how data is collected or shared. They encourage people to read those contracts carefully.
Still, personal data can be used in ways that are difficult to predict, LeVasseur said.
Children, especially those under 8, have difficulty distinguishing between advertisements and media containing advertisements, child development experts say.
Even if tech products marketed to children claim they don't sell children's data to third parties, they are likely still making money from that data, according to a 2023 analysis by Common Sense Media, a nonprofit research and advocacy organization.
Low-income schools use some of the riskiest apps, but there is a simple solution
Internet Safety Labs conducted an extensive audit in 2022 of the educational apps that a sample of districts recommended or required students to use. The sample included 663 districts across all 50 states and the District of Columbia. The original analysis found that schools were using many apps that did not protect students' data privacy or that Internet Safety Labs rated "very high risk."
This latest report reanalyzes the original audit to identify differences between schools based on student demographics. (One caveat: the sample of schools from the lowest-income communities, 18 in all, was smaller than the samples from the other groups, and the organization said it would have liked it to be larger. Nonetheless, it said it was confident in the overall results.)
Overall, Internet Safety Labs found that schools that vet apps at the school or district level appear to recommend or require apps with fewer contextual ads (ads matched to the content of the website or app the user is viewing) and fewer behavioral ads (ads targeted using data collected about the user).
The analysis also found:
- None of the lowest-income schools in the sample, which primarily serve students from families earning between $20,000 and $39,000 per year, had systematically vetted the technologies they recommended or required students to use.
- Schools that primarily serve students from families earning more than $100,000 per year were more likely than low-income schools to vet the technology their students use.
- At schools serving low-income students, 9.8% of recommended or required apps included contextual ads and 9.5% included behavioral ads, compared to 2.7% at the highest-income schools.
- Schools serving low-income students were less likely than higher-income schools to provide families with technology notices that clearly listed all technology products students should use.
- Schools with majority Black students experienced the greatest data privacy issues on their school websites.
LeVasseur said the analysis was lenient about what counted as vetting.
But even basic vetting appears to work. "If it's doing anything, it looks like it's filtering out apps that contain ads and behavioral ads," she said.
That should be encouraging news for under-resourced schools. At a minimum, LeVasseur recommends that schools check whether an app carries a COPPA Safe Harbor seal before recommending or requiring students to use it. Under the Children's Online Privacy Protection Act, the Federal Trade Commission has authorized certain groups to run COPPA Safe Harbor certification programs that follow the law's guidelines.