Ethical AI Blog
Here we bring you stories of ethics, artificial intelligence, and what happens when robots go bad, act unethically, or work against humanity…
The purposes of ethics and the law are often distinct, yet the EU is on a path to turning ethical principles into legal rules. Is this the right approach?
Twitter is re-evaluating an image-cropping algorithm after evidence emerged that the technology seemingly favored images of white individuals while hiding those of people of color.
Direct discrimination occurs when somebody is treated unfavourably because of an attribute such as age, disability, race, or sexuality.
After being sued by two groups, the United Kingdom’s Home Office has agreed to halt its use of, and substantially redesign, an algorithm that it had been using to analyze and support visa applications.
Procedural fairness is concerned with the procedures used by a decision maker, rather than the actual outcome reached.
If you are a parent in Australia and put bowls of ice cream in front of two siblings, the first thing they do is examine the quantity of ice cream in the other’s bowl.
“Deepfakes” – AI-generated fake images, videos, and audio files – are becoming commonplace as they proliferate across the internet.
The algorithm failed to identify more than half of the Black patients who should have been categorised as “high risk.”
A toolkit can make all the difference when it comes to the application of ethical principles.
Singapore has been a significant contributor to the global discussion on the ethics of AI, recently releasing three documents for trade associations and chambers, professional bodies, and interest groups to discuss and adapt for their own use.
Bad Robots: Global Exam-Grading Software in Trouble for Algorithmic Bias – the International Baccalaureate Program’s exam-grading algorithm may have adversely impacted the test scores of low-income and minority students.
Amazon’s AI-enabled recruitment tool “downgraded” resumes that contained the word “women” or that otherwise implied the applicant was a woman.