Sunday, December 3, 2023

Driverless Cars to Drone Deliveries: 7 Legal and Ethical Hurdles in Consumer Technology

Federal and state governments are supposed to use their regulatory powers to make sure that consumer technologies are sold to the public safely and ethically. But technology is now expanding its capabilities more quickly than ever, and the relationship between innovative businesses and their regulators isn’t always a smooth one.

The following are just a few of the thorny issues that businesses, attorneys, consumers and governments will find themselves dealing with in the near future—if they’re not already. These seven questions will have major implications for how we relate to our tech in the future and may even decide the winners and losers of the 21st century economy.

  1. Who assumes liability for a driverless car?

Driverless cars are nearly here, and their potentially sticky legal implications are quickly becoming apparent. The biggest legal issue facing the autonomous car industry? Liability.

There was a time when, from the moment a driver picked up that little plastic key fob, they were in charge of the car and liable for any accidents. With new automation technologies, though, the game has changed. Even so, it's worth remembering that fully autonomous cars will, for a while, still require an alert human driver watching the road.

But even that raises serious questions about who will be held accountable if an autonomous car crashes—the driver/passenger or the automaker? As of yet, we don’t have a satisfactory answer.

  2. What can a genetic testing company legally do with your DNA?

Had it not been for the newly popular home DNA testing industry, police might not have caught the infamous Golden State Killer. But their methods raised some eyebrows. They used the DNA database GEDmatch, which catalogs user-submitted data from services like 23andMe, to find one of the killer’s relatives and match their DNA with a sample taken from a crime scene.

Obviously, apprehending violent criminals is a win for everyone. But the case raised very real privacy concerns about who has the right to access your genetic data. The kicker is that your own DNA never has to be uploaded to most such databases: all it takes to pull you into their net is the DNA of your third cousin or closer.

  3. Can you call it meat if it’s grown in a lab?

The advent of lab-grown meat (also called cell-cultured meat or clean meat) offers incredible potential for fighting hunger and mitigating the environmental impact of livestock farming. The technologies are advancing rapidly, and they’re expected to be ready for the spotlight in the next several years.

However, some in the meat industry insist that the two aren’t the same, and that terms like “clean meat” mislead consumers. Missouri, for example, has banned any non-livestock product from being labeled as “meat.” Cell-cultured meat entrepreneurs also face the familiar hurdle of a regulatory system that just isn’t ready for them: only last year did regulators even decide which federal agency would have jurisdiction over the industry.

  4. How far is too far with facial recognition?

Facial recognition software is one of the most controversial technologies of the early 21st century. Amazon found itself caught in the fray recently when it filed a patent for doorbell cameras with facial recognition features that could automatically call the police under certain circumstances.

The backlash was swift and intense as consumers pointed out major privacy concerns over a privately run facial recognition database with the power to initiate a police response. Amazon assured the public that it was addressing the concerns, and there’s been no indication of whether it will actually put the devices into production. But the episode is a reminder of the enormous power the e-commerce titan wields.

  5. How can you keep bias out of algorithms?

You might think computers and data are, by definition, unbiased—but you’d be wrong. Many social scientists have shown that the beliefs and prejudices of the algorithm designers tend to be replicated in the algorithms, with potentially severe consequences.

Remember those doorbell cameras we just discussed? Research has repeatedly shown that facial recognition technology is less accurate for women and people of color. And the examples go well beyond that, from crime-prediction algorithms to tools designed to screen job candidates. Tech leaders will need to step up and do the hard work of rooting out the biases that have crept into systems that should be objective but all too often aren’t.

  6. Social media: platform or publisher?

Of all the issues on this list, this one may be closest to coming to a head. For years, social media sites like Facebook have had their cake and eaten it, too. In court, they’ve argued that they should be treated as content publishers, with a degree of editorial control over what they publish.

But they’ve also argued publicly that they should be treated as a platform—a neutral ground for free speech, which doesn’t have the same responsibilities as a publisher. In the political mudslinging that’s consumed social media, a breaking point seems inevitable, and Facebook may soon have to choose how it wants to be classified.

  7. How will drones be regulated?

For several years, the FAA has been seriously slow-walking entrepreneurs enthused about the new possibilities of drone-based commerce. The current drone regulations—no night flying, no flying above 400 feet and, perhaps most importantly, no flying out of visual range—are badly in need of a revamp as the industry gets ready for a major expansion.

It’s another case in which the regulatory apparatus couldn’t keep up with the pace of technological innovation. As with self-driving cars, however, drones have such potential to change commerce and society that it’s critical to get the rules right on the first try.

As innovative electronics solutions continue to transform the world, it’s ironically more important than ever to be patient. Courts, regulatory systems and technological innovators will continue to find productive ways to work together, as they always have—and the world will be better for it. While you’re waiting, the best practice is to know your mission, know your values, and let them guide your business decisions.
