Consumer rights aren’t guaranteed in a digital world, warns Consumer Reports CEO

In today’s technology-driven economy, opaque digital products and services make consumer rights more difficult to enforce.

That’s the warning at the heart of the new book, “Buyer Aware: Harnessing Our Consumer Power for a Safe, Fair, and Transparent Marketplace,” by Marta Tellado, president and CEO of Consumer Reports, a nonprofit that does product testing and other consumer advocacy work. As the use of algorithms and artificial intelligence increases, so does the potential for unfair practices that consumers are unaware of, she said.

“You have to make decisions every day based on things you can’t see, feel or touch,” Tellado said in an interview with Marketplace’s David Brancaccio. “This is incredibly challenging for everyday consumers.”

The following is an edited transcript of their conversation.

David Brancaccio: Maybe people think your job is to review robot vacuums or remind us again that Toyotas are good. But do you see your professional work as civil rights work on some level?

Marta Tellado: Oh, that’s right. I think a lot of people know us because of the ratings. But what we really do is shape the market to make it fairer and safer. What I tried to do in the book “Buyer Aware” is tell a bigger story: our democracy can only flourish if we have a marketplace that is fair and open to all consumers.

Brancaccio: All consumers: no one wants to be taken advantage of by the companies we interact with. But just as important, reading your book you see the issue of equity. Exploitation falls harder on some than on others.

Tellado: Oh, that’s right. For me, the seed was planted when I came to the United States as a young immigrant child. After the revolution in Cuba, my parents had to rebuild their economic lives, and I experienced that firsthand; I’m very grateful to have come to a democracy. But I also saw firsthand that economic freedom is a civil right, and you can’t have a fair market if there’s inherent bias. When our economic power and our agency are eroded, so is our power to act as free and equal members of our democracy. So the book tries to identify some really troubling examples and to show the strong connection between a free and fair democracy and economic opportunity in the marketplace.

The challenge of digital consumer rights

Brancaccio: When thinking about civil rights and equality as they relate to consumer rights, some people might think of, say, land taken from Native American tribes or access to broadband internet. But it goes deeper. With emerging technology, are you also concerned about artificial intelligence’s built-in biases?


Tellado: Well, I’m glad you asked. We’re proud of all the work: for over 86 years, we have worked to enact laws and regulations for fairness and justice in the marketplace. But unfortunately, those rules haven’t carried over to the new, less transparent digital landscape. Algorithms are invisible; every day we have to make decisions based on things we can’t see, feel or touch. This is incredibly challenging for everyday consumers.

Brancaccio: I spoke with the chair of a computer science department at an engineering school. He pointed out that with machine learning, even computer scientists cannot reverse engineer their own systems to fully understand why a device decides to do one thing or another. And you can see how that can lead to abuse and discrimination.

Tellado: That’s right. When you think about machine learning: bad data in, bias out. I devoted an entire chapter to what it means when an algorithm discriminates against you. Sometimes it’s a life-and-death situation. Consider how clinical bias shows up in an algorithm. Say you go to the doctor and find out you have end-stage kidney disease. That means you will need a transplant, and we know transplants are in short supply; you must get on a national waiting list. How do you qualify? You have to score 20 or below on a test, based on your medical data, that shows how quickly your kidneys filter blood. But if you’re Black, there’s a flaw that adjusts your score: a race-adjusted statistic based on a flawed study done in the ’90s, bad data, which assumed that Black people have different kidney function. So let’s call this patient Eli. He goes in for the test. Because of the adjustment, he doesn’t score 20 or less, so he doesn’t make the cut. It’s an algorithm that’s not transparent to the patient, and it’s making life-and-death decisions about that patient’s access to medical care, in this instance life-saving treatment.
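The mechanism Tellado describes can be sketched in a few lines of code. This is a hypothetical illustration, not the actual clinical software: the 1.159 multiplier mirrors the race coefficient in the now-retired 2009 CKD-EPI eGFR equation, and the threshold of 20 is the waiting-list cutoff mentioned in the interview; the patient numbers are invented.

```python
# Hypothetical sketch: how a race coefficient in a clinical scoring
# formula can flip eligibility. The 1.159 multiplier mirrors the race
# adjustment in the retired 2009 CKD-EPI eGFR equation; all numbers
# here are illustrative, not real patient data.

THRESHOLD = 20.0  # transplant waiting-list cutoff described above

def adjusted_score(base_egfr: float, race_black: bool) -> float:
    """Apply the (flawed) race multiplier to a base kidney-function score."""
    return base_egfr * 1.159 if race_black else base_egfr

def eligible(egfr: float) -> bool:
    """Lower eGFR means worse kidney function; 20 or below qualifies."""
    return egfr <= THRESHOLD

base = 18.0  # identical underlying kidney function for both patients
print(eligible(adjusted_score(base, race_black=False)))  # True: 18.0 is under the cutoff
print(eligible(adjusted_score(base, race_black=True)))   # False: ~20.86 is over the cutoff
```

Two patients with the same measured kidney function get different answers, and the multiplier is invisible to the patient, which is exactly the transparency problem the interview raises.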

Brancaccio: Now, I know you looked for this in your book, and I don’t think you found it: which federal regulation governs fairness in artificial intelligence?

Tellado: Unfortunately, there is no federal rule. As proud as we are of the work we’ve done in consumer rights and protection, those rules, regulations and protections have not migrated to the digital landscape. Regulation reaches into many areas, but artificial intelligence is not one of them. We may have great leaders in our agencies right now, but they don’t have the tools, capabilities or guidelines to provide greater fairness and transparency. And you cannot trust what is not transparent to you, and you certainly cannot hold it accountable.


Potential discrimination hidden in algorithms

Brancaccio: Consumer Reports is involved in the public policy process. Has your organization weighed in on issues like this?

Tellado: Oh, yes. Many people come to us because they are making individual choices. What we look at is: how do those choices play out across the market? We have a digital lab that looks exactly at this kind of bias. For example, we looked at car insurance. You’d think your car insurance is priced on your driving record, whether you have tickets or claims. But in fact, the algorithm also looks at non-driving factors about you: where you live, your income, your educational level. What we discovered is that the price you pay for car insurance has more to do with your ZIP code, whether the neighborhood is Black, white or Hispanic, than with your driving; that determines what you pay for that premium. So Black and Hispanic neighborhoods pay higher premiums than white neighborhoods based on that bias.
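The pricing pattern Tellado describes can be made concrete with a toy model. This is a hypothetical sketch, not Consumer Reports’ actual findings or any insurer’s real formula: the ZIP codes, factors, and base premium are all invented, chosen only to show how a territory multiplier can outweigh the driving record.

```python
# Hypothetical toy premium formula (invented numbers): a ZIP-code
# "territory factor" dominates the driving record, so two identical
# drivers in different neighborhoods pay different prices.

TERRITORY_FACTOR = {"10001": 1.4, "10002": 1.0}  # assumed, for illustration
BASE_PREMIUM = 1000.0

def quote(zip_code: str, at_fault_claims: int) -> float:
    """Toy quote: small surcharge per claim, large ZIP-based multiplier."""
    driving_factor = 1.0 + 0.05 * at_fault_claims
    return round(BASE_PREMIUM * driving_factor * TERRITORY_FACTOR[zip_code], 2)

# Identical clean driving records, different neighborhoods:
print(quote("10002", at_fault_claims=0))  # 1000.0
print(quote("10001", at_fault_claims=0))  # 1400.0
# An at-fault claim in the cheaper ZIP still costs less than a
# clean record in the pricier one:
print(quote("10002", at_fault_claims=1))  # 1050.0
```

In this sketch a 40% territory surcharge swamps a 5% claims surcharge, which is the shape of the disparity described above: where you live matters more than how you drive.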

Brancaccio: And then there are even language barriers, with companies failing to communicate important information in languages people can understand.

Tellado: As you said, David, in some of these instances the stakes are really high. They are life and death. And I’ll give you an example of something that’s not an algorithm but a product, a product everyone knows. Any time you go to the hospital, you see a clip on someone’s finger: a pulse oximeter. But it doesn’t work as well on darker skin tones. What we know is that for people of color, the device is three times more likely to miss low oxygen levels than it is for white patients. And when you show up at the ER and get tested, the effects can be dramatic: if your reading looks fine, you can be turned away. We saw during the pandemic that many people of color were turned away, with disparate impacts. So, again, fairness by design is something we also look at at Consumer Reports.


Brancaccio: For these devices, you may find engineers saying, “I never thought to check for that.” And what you’re saying is, “Well, think of it, companies.”

Tellado: Oh, that’s right. Another area where we see a lot of bias, and we’ve known this for a while, is that women are injured far more often in car accidents. The reason is biology: women’s bone structure is very different from men’s. Yet crash-test dummies were not anatomically representative; they are based on male bodies and the forces an accident exerts on them. It’s still a battle for us. But that’s part of what we do: we have to remove bias and create products that work for everyone, so we test these products.

Brancaccio: Marta, do some people push back on this? Just tell us whether the blender is good or bad?

Tellado: We get that all the time. People say, “Well, wait, you know, stay in your lane. Tell me what car to buy or what kind of blender.” But David, the stakes are very high. As you said, we are a nonprofit organization; we are not a private publishing company. Like public radio, we are powered by our members. That’s what makes us a public good. We work to make the marketplace better for consumers, and we collect data because we want it to be a fair and transparent place. That’s why our work underlies many of the safety standards built into the products you bring into your home. We still carry a burden with the hardware products we bring into our homes, and the burden on consumers in the digital landscape is real. We now live in a world where privacy and the security of our data are not the default setting.

