"The primary mission of COPPA is to place parents in control of what info is collected from their children. As a parent, I find this impossible." - Cheri Kiesecker
What is COPPA? In 1998, Congress made some value judgments about the potential harm to children due to online activity, and decided it was important for websites targeted toward kids to obtain parental consent before they collect, use, or disclose any of a child's information and data. The Federal Trade Commission (FTC) made rules and suggestions as to how that would happen, and COPPA is the result of this work.
Show Notes and Links:
COPPA Workshop Video
Twitter: #COPPAworkshop
Great articles about the FTC workshop:
https://www.nytimes.com/2019/10/10/opinion/coppa-children-online-privacy.html
https://www.edsurge.com/news/2019-10-08-the-ftc-has-its-sights-on-coppa-and-edtech-providers-should-take-notice
Tech Tool of the Week: It's Digital Citizenship Week! Follow #digcitcommit. The hashtag has a ton of great ideas for incorporating this topic into your classroom. Also visit Common Sense Media for our #TechTooloftheWeek.
Danelle Brostrom 0:04
Okay, can I tell you I'm worried about this topic.
Danelle Brostrom 0:12
What, is today Tuesday?
Danelle Brostrom 0:14
Danelle was awful this week so
Larry Burden 0:17
it'll be. We'll see
Danelle Brostrom 0:19
Can that be your intro. Heck yeah
Larry Burden 0:21
So there we go.
Larry Burden 0:26
It's Episode 93 of the EdTech Loop podcast my name is Larry Burden and she wishes she was at the hottest spot north of Havana, it's Danelle Brostrom. It's really cold in here.
Danelle Brostrom 0:37
It's really cold in here.
Larry Burden 0:38
After reading the I Ching's terms of service agreement, we've decided to share publicly this week's moment of Zen.
Moment of Zen 0:45
They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.
Larry Burden 0:54
The rest of these essential ingredients are copyright protected, but we've, we've obtained the rights to add them to this week's meat of the show. The future of privacy. I had a question to start out with, but I really wanted, I had these quotes and I'm like, I want to start out with this and see where you go with it. So the quote this time was from Twitter, because Twitter. The primary mission, first, am I pronouncing it right, is it Copa?
Danelle Brostrom 1:20
COPPA,
Larry Burden 1:21
COPPA because they don't want it to be the Copacabana, which is the reference earlier.
Danelle Brostrom 1:24
I got'cha. Now I get it.
Larry Burden 1:27
There we go, and it's really cold in here. Did I mention it's cold in here?
Danelle Brostrom 1:29
It's cold in here.
Larry Burden 1:31
"The primary mission of COPPA is to place parents in control of what info is collected from their children. As a parent, I find this impossible." And that's from Cheri Kiesecker @cherkies. I do believe on Twitter, and I read that and I thought to myself. Yep. Yep. So, what is COPPA?
Danelle Brostrom 1:55
The original COPPA was passed in 1998, and it took effect in 2000. So just, just think about that in terms of where we've come in ed tech, first of all,
Larry Burden 2:04
First what is COPPA?
Danelle Brostrom 2:06
You're gonna kick me back there aren't you? Okay.
Larry Burden 2:08
Acronym time.
Danelle Brostrom 2:09
So, essentially Congress made some value judgments about the potential harm to children due to online activity, and decided it was important for websites that are targeted towards kids to collect parental consent before they collect, use, or disclose any of the child's information and data. So the FTC, which is the Federal Trade Commission, made some rules and suggestions as to how that would happen, and COPPA is kind of the result of this work. So it really provides special protections for what they deem to be children, which is actually under 13, which is another conversation that they had. But it really just tries to protect our littlest, our littlest littles in their online activity. So, what I was saying is it's interesting because this was thought of in '98, when the internet was really just starting to gain some traction, and then took effect in 2000. That was 19 years ago. There was a small update done in 2013, which I want to talk about later, but there's a lot that has happened since then. Which is why, I think, this topic came up on my radar last week. Last week the FTC had a workshop called "The Future of the COPPA Rule," and they had an entire day's worth of experts who were talking about this digital playground idea. They had panels, they had experts, it was just amazing and fascinating, and that's actually why I was late to record the podcast today, because I was sitting in my office watching the tape from last week, so.
Larry Burden 3:43
You are correct, nerdy.
Danelle Brostrom 3:44
I know so nerdy, but I will link it up in the show notes because I think there's some really good, there's some good information and there were some sessions that were really really good so I will point those out.
Larry Burden 3:54
It seems to me that in the early 2000s, we were actually a little bit ahead of the game.
Danelle Brostrom 4:00
For sure, oh yeah.
Larry Burden 4:01
At that point in time, you know, some very smart people saw that there were potentially going to be some issues. As we've gone through the next two decades, it seems like we've kind of dropped the ball. Fumble!
Danelle Brostrom 4:11
We have, we have dropped the ball. But I don't think it's really our fault. If you look at even just the number of mobile devices in the home from when this was enacted to now, I mean, you went from like 41% had smartphones to like 95% have smartphones, and they're putting their kids on them. Just that tidal wave of an increase in devices. It was dramatic, and it was game changing, and I think we as parents and teachers are trying to keep up. I think our pediatricians are trying to keep up. The researchers are trying to keep up. And everyone's just trying to figure out how to manage it all. I don't think that we meant to drop the ball. I think that it was just all of a sudden the tsunami was coming at us and we were like, oh, well, I wasn't prepared for that. So, I mentioned that in 2013 they did do some updates to it. They said things like, hold on, I'm going to grab my notes because this is a really, really meaty topic.
Larry Burden 5:11
There's a lot of notes,
Danelle Brostrom 5:12
There's a lot of notes. But they did increase the protections. They said that we are going to talk about new forms of PII. So, PII is all the things that we use to identify kids. They are now talking about behavioral advertising, so identifiers that can recognize the user over time and across websites, that can be considered PII. And photos, and videos, and audio files can be a kid's PII. So if you're putting an audio recording of a kid online, that's their personally identifiable information. Geolocation is also included in that. They added some new forms of parental consent, so some new ways that parents can say yes, I approve of this site, or no, I don't. And then they also increased the coverage to talk about, like, Internet of Things devices, like little VTech tablets and things like that. And platforms like YouTube were included for the first time. So, it was good that they made some changes there, and I think, and they talked about this in the panel, that there's a lot more they still need to do. And some countries are further ahead than us, like looking at the GDPR over in Europe. They're a lot more strict, and they do provide people more control over their data. And California enacted some laws to increase protection for kids under 16, so that if you're a business you have to have like an age gate on your website that says, are you for sure over 16, and they have to opt in or opt out. It's not perfect, but I think
Larry Burden 6:41
Sounds, sounds less than enforceable.
Danelle Brostrom 6:44
It's, yes it is unfortunately. But I think the fact that people are talking about this, and people are asking for more regulation is just showing that we don't quite understand, we don't quite trust what companies are doing with our data. So, it's good that people are talking about this. It's great that they had this entire day of experts talking about this rule and how they can make things better for kids.
Larry Burden 7:07
You know, briefly going through the thread, the corporations don't have a handle on it, the users certainly don't have a handle on it. I think there's a lack of understanding of some of the basic issues, probably because it is going so fast. I forget the TED talk, it was the Google designer, he was a former Google designer,
Danelle Brostrom 7:30
Okay,
Larry Burden 7:30
Who brought up that attention, right now, is really the commodity. That's the currency, and I don't know if companies really recognize, you know, or the powers that be really recognize, that that is the case, even though they are marketing to it. One of the stats that did come up was, you know, 53% of kids ages three to four have already been online and have an online presence, and by the age of 18 every person has at least 70,000 trackable data points. And I have to say, I wouldn't doubt that's old data.
Danelle Brostrom 8:13
Yeah,
Larry Burden 8:14
It came in reference to a particular business owner who mentioned that their digital devices, or online devices, aren't marketed to kids. Either that was just a lie, or there was a lack of recognition that, since most things online are based off of algorithms, anytime something is getting tracked and that data is getting logged, the algorithm is going to feed marketing to the child. They're going to give them what they want based on the clicks and those data points. So even though this device isn't specifically marketed to the child, which I don't necessarily buy,
Danelle Brostrom 9:00
Right
Larry Burden 9:00
either, but it's a soft currency, even as a soft currency, that being attention, the algorithm is marketing to the child. Specifically, the individual child. We are allowing that, that currency to be spent.
Danelle Brostrom 9:18
You should,
Larry Burden 9:19
I told myself I wasn't gonna get on a soapbox and that took about five minutes.
Danelle Brostrom 9:23
You should go back, Larry, and watch the section of the FTC workshop. The first person who spoke, her name is Jenny Radesky, and she's from the U of M medical school. She was fascinating with the research that they've been doing down there on app design, data collection, and policy implications. They did a content analysis of 135 apps that are marketed to or played by children that were found in the kids section of the App Store. So, these aren't necessarily ones that were recommended for kids, but they were ones where they said, sure, kids can play 'em, we'll throw them in the kids section. And they...
Larry Burden 10:00
Not to interrupt, but they're not recommended for kids, but if a kid is on the App Store, and the data points that they're placing on the App Store lead them there, guess what, it is marketed to kids.
Danelle Brostrom 10:12
Right, and they found on those apps that were marketed towards kids lots of pop-up ads, banner ads, sponsored content. And some of them said things like "sponsored content," but they're marketed towards preschoolers, and preschoolers can't read. So you're trying to get them to click on those things. Some of the ads even had inappropriate content for kids, guns and knives, adult-themed content. The ad networks aren't filtering out the adult ads, and they're also using an adult-centered design. They're trying to apply that to kids, and kids can't distinguish, all the research shows that kids, even school age and under, can't distinguish what's going on with that advertisement. So applying an adult-centered design to kids' apps is really something we as consumers need to fight against, because kids deserve content that is designed for them, and that is safe for them, and that is just not going to put them in situations where they're seeing all these things. That's not fair.
Larry Burden 11:13
That leads me to the question, so what is age appropriate design? I saw that a lot, in a lot of the, the tweets. What does that mean?
Danelle Brostrom 11:22
I would look to PBS as a leader in that. If you look at the stuff that PBS is putting out, they are always thinking of the end user and the child. I mean, man, it goes back to Mister Rogers' Neighborhood, ya know. I know, but that was designed for a child, and what a child needs, and I think that, as a culture and as a society, we need to recognize that kids need something different and kids deserve something different. And even if you look at the number of kids that are playing, we talked about the content that was marketed towards kids, but a lot of kids are playing the content that isn't marketed towards them, and how do we keep them safe? I think it's our duty to give kids a better experience, and I would look to PBS as a leader in that.
Larry Burden 12:09
So a couple things. Talking about Mr. Rogers
Danelle Brostrom 12:13
Aww, see. I did it too.
Larry Burden 12:16
And I want to bring up something called the Children's Fire. I think we may have talked about this as well. It's from a book by Mac Macartney, and he also had a talk, and we'll link to the talk in the show notes. So I think it's really important. And the general concept, ya know, not to go too deep into it, is a healthy society puts the child first. And when they're making decisions on an adult level, the first question that's asked, before any other question, is: is this good for our children? So when we're talking about age appropriate design, when we're talking about algorithms, when we're talking about any of this, on the highest level, on that, to be honest, on that corporate level, when decisions are being made about what is appropriate, or what is healthy, for our product, long term, really the first question that needs to be asked or should be asked is, is this good for our children? Until that happens a lot of the conversation around COPPA is going to probably miss, or is going to be hard to enforce, or hard to establish, because the will behind it isn't there. PBS is doing a very good job, but if the algorithm isn't leading the child to PBS, the child's not going to see it. Their decisions are going to be led someplace else. And as Cheri Kiesecker had said, "as a parent, I find it impossible," to keep track of those data points and really manage my child's online identity, without it being a real community effort.
Danelle Brostrom 13:59
I think it has to be. I don't think it's fair to put that on parents. And I think that's what our tech companies and our app makers have said. They've said, well parents need to use this monitoring software, and parents need to do a better job of being next to their kids when they're using their device to make sure that they're doing what's appropriate. Yeah, we should, but, wow, that is so ridiculously hard and you can't put another thing on my shoulders because as a parent, I am trying to do a million things and you've got kids with these devices that can move from room to room. It's not just sitting down and watching a television program together and talking about it, it's,
Larry Burden 14:35
They're not wrong. I mean, they're not wrong,
Danelle Brostrom 14:37
They're not wrong but man,
Larry Burden 14:39
but it should
Danelle Brostrom 14:39
help us out a little bit.
Larry Burden 14:40
Exactly, exactly. It's really easy for both the parent and the corporation, or the school, or whatever you know, entity, it is to say, it's somebody else's fault.
Danelle Brostrom 14:50
Yeah, right.
Larry Burden 14:52
And I'll go back to the, the Children's Fire, until all the parties involved start with the question: is this good for the child? we're probably going to miss the mark. It really does affect how our children are perceiving the world, and how they're educated, and how they grow, and what society they grow into.
Danelle Brostrom 15:14
And I think EdTech needs to be talking about this. I know that sometimes when I have this discussion with my teachers, they're like: what are you doing, everyone else is using this app, nobody's talking about this, why are you, why are you bo..., why are you putting a lid on this? And I feel like, okay, not everyone is talking about this yet, but man, we have to be, especially in EdTech. ISTE did a survey, and districts with over 1,000 students, which I mean, we're over 10,000, but districts with over 1,000 students are using approximately 548 different ed tech tools. And that came from the Project Unicorn site, which, if you don't know Project Unicorn, they're amazing. They do a lot of work with interoperability, so how can all these things talk to each other so that it doesn't end up being more work on the teacher's part to collect all this data. It's a whole other thing we'll talk about another time. But, man, just 548 distinct ed tech tools. Think about all that data, and all of those kinds of issues.
Larry Burden 16:11
In a district a tenth the size.
Danelle Brostrom 16:11
Yeah, all those companies are, they're marketing to schools, and schools are signing up because this is a great way to get kids engaged, or this is going to help with this one problem I have in the classroom, or this is going to make me more efficient, or whatever. But you can't rely on those tech companies to protect our kids' data. We have to be on it, and we have to check it, and we have to use something else if that doesn't do what we want. I think this is a discussion worth having, and especially in Ed Tech.
Larry Burden 16:41
No matter what we do, there are data points.
Danelle Brostrom 16:43
Yes, but we can do a better job for our children. Our children deserve better than what they're getting right now. They deserve to have control of their data. They deserve to say no, I am, I am six years old and I shouldn't be tracked and marketed towards. We, like what you talked about, we need to do better for our children if we want them to grow up in a society that just makes sense and it's good for them.
Larry Burden 17:08
Well, I think we already have lost, or don't have a handle on, the ways in which our simplest attention is affecting the environment that's put in front of us. It's not '50s advertising, it's a lot more subtle, it's a lot softer, and a lot more invasive, and we don't see it, and we certainly don't see it for our kids.
Danelle Brostrom 17:31
Here's what I think is encouraging though. There are experts looking at this. There are experts that are talking about this. And they are trying to make this better. That's important, I think five years ago, if you went to, go back to ed tech, if you went to any of your big ed tech conferences, this was not on anyone's radar at all. So I think the fact that you will see sessions about this, and people are talking about, about data and how we can do better for our kids. That's encouraging. So this isn't Black Mirror.
Larry Burden 18:00
Thank you. Thank you. So what were some of the findings?
Danelle Brostrom 18:03
Some of the work being done, just that. Well, they definitely need to look at COPPA again, and make sure that they're covering all the things that they want to cover with it. There was a lot of discussion about, is 13 the appropriate age? Should it be higher? No decisions were made, it was just a discussion among experts. And then I think one of the big things that they kept reiterating was that there is some potential harm with using emerging technologies, and they were concerned that people were going to go, oh, this is bad, we're out. But there's so much good there as well. So, I think you can't get freaked out and totally cut yourself off from everything, which we talk about a lot on this podcast. There are a lot of good things about e-learning. You know, data can help support teachers in making better decisions, and we can really zero in on what that individual child needs, and we can do a lot of good things with letting kids be creative...
Larry Burden 19:08
Data is not necessarily bad.
Danelle Brostrom 19:10
No, data is not necessarily bad, and all those online things that we use to do those great things, they do collect personal data, but we just need to be cautious. The fact that it's not all bad, I think, was a big thread throughout the conference. And then they also talked about whether schools and teachers can continue consenting on behalf of parents. Because right now they can. There's a school official designation or something like that, where I can say that, yes, we need this for school, I'm going to consent on behalf of parents and let my kids use this. But parents should really be in control of their minor status. So there was a lot of discussion about that. I think that the FTC is going to revisit this, and, you know, time will tell what kind of protections we can put in place for our kids, but I think it was encouraging.
Larry Burden 19:57
The fact that the discussion is happening, is a step in the right direction.
Danelle Brostrom 20:01
We're talking about it on the podcast...nerds.
Larry Burden 20:06
There's so much more meat on this bone,
Danelle Brostrom 20:08
No, I'm, I think there are definitely some articles and some images from the recording that I want to link up, and the recording itself, and I'll do that in the show notes.
Larry Burden 20:17
Going back to the "what is age appropriate design," as we're bringing our children through the school system, incorporating appropriate digital literacy as they're going through. When they get to 13, the hope would be they're more literate, digitally literate and able to handle some of the decision making regarding data and privacy at that point because they've been made aware, and educated appropriately regarding it.
Danelle Brostrom 20:45
Boom. Can you TechTool of the Week me?
Larry Burden 20:47
Is it, is it that time. Is it TechTool of the Week time?
Techtool of the Week 20:54
It is Digital Citizenship Week, and there is a growing hashtag on Twitter, #digcitcommit, where people are talking about what they can do in the next however long to increase their awareness, or their students' awareness, of digital citizenship. So, I just have to talk about Common Sense Media again. I think that's my TechTool of the Week. It is an amazing resource, and, no kidding, every single time I go there to look for something I find something new that I didn't know was on there, and I get really excited about it. The lessons that they offer for educators K-12 are fantastic. They're short, they're to the point, and they cover a wide range of topics in digital citizenship and media literacy. One of the things that they found in their research is that when we talk to kids about privacy and data collection and all that stuff, they know about passwords, they know that they should keep their password safe. They know that photo sharing is an issue; they know that they should ask people before they share their photo online. But they don't understand the complex things, like that, you know, a school has your data, and social media platforms have your data, and all of those things. There are Common Sense Media lessons around those. So I think the more that we can talk about those topics that are outside the norm, the better. Common Sense Media is covering them. And they're talking about how images are altered and retouched and how companies market to you, and I just love their resources for educators. And what I found last week that I got really excited about: they have some ready-made presentations. So if you, or someone in your school district, wants to talk to families about this, but you think you're not an expert, you don't quite know all the information. Wow, none of us are experts. None of us know all the information, but opening up that conversation is a great place to start, and Common Sense Media has some amazing resources and presentations and question guides that you can use to start that conversation with families. So, Common Sense Media is my TechTool of the Week.
Larry Burden 22:53
Tutorials and updates. The TechNollerGist is just absolutely out of control. Power Teacher Pro and Bright Arrow introductions, Ed Tech Standards Framework, Help Options and more. And I'm thinking about Instagram. I haven't quite figured it out yet but I'm looking at all these tutorials and I'm like, you know what, I wonder if we can do something with those on Instagram. So, you know feel free to give me any feedback.
Danelle Brostrom 23:20
I'll read the Privacy Policy and then I'll tell you.
Larry Burden 23:23
Well played,
Danelle Brostrom 23:24
boom,
Larry Burden 23:25
In closing, follow us on Facebook and Twitter @TCAPSLoop,
Danelle Brostrom 23:27
@brostromda
Larry Burden 23:29
Subscribe to the podcast on Podbean, iTunes, Stitcher, TuneIn, Downcast, Overcast, the Google Play Store, or wherever else you get your ear candy. Please leave a review, we love the feedback. Thanks for listening and inspiring.
Larry Burden 23:47
I'm going to edit a decent amount of that out.
Transcribed by https://otter.ai