AS ARTIFICIAL INTELLIGENCE (AI) worms its way into many areas of life, society will need to become comfortable with algorithms, not people, making decisions. The systems have already shown promise in areas ranging from banking and e-commerce to healthcare and policing. Yet worries are growing that the algorithms may take on too much control, especially if people forfeit decision-making to machines, as with self-driving cars or courtroom sentencing. If such worries prevent AI's use, there is a risk that society and the economy will fail to receive its potential benefits.

Hannah Fry has studied these systems for years as a mathematician focusing on urban issues at the Centre for Advanced Spatial Analysis at University College London. However, she is better known as a great populariser of maths and science through her public lectures and documentaries on the BBC. In her latest book, "Hello World", Ms Fry demystifies how the technologies work, looks back into history to explain how we came to adopt data-driven decisions and offers a clear-eyed analysis of the pros and cons. The benefit is that AI can often perform tasks more quickly and accurately than people; the drawback is that if the data are biased, then the output may…
"Algorithms should take into account, not ignore, human failings" (298 words) was published on www.economist.com on April 8th 2019.