True is the correct answer.
Explanation:
For a better understanding, let's take an example:
- In 2018, Reuters reported that Amazon had been working on an AI recruiting system designed to streamline the recruitment process by reading resumes and selecting the best-qualified candidate.
- Unfortunately, the AI seemed to have a serious problem with women: the algorithm had been trained on past hiring data, so in learning to replicate existing hiring practices it also replicated their biases.
- The AI picked up on uses of “women’s” such as “women’s chess club captain” and marked the resumes down on the scoring system. Reuters learned that “In effect, Amazon’s system taught itself that male candidates were preferable.”
The above example clearly shows that the quality of your model is usually a direct result of the quality and quantity of your data.
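This idea can be sketched with a toy scoring model. Everything below is hypothetical and invented for illustration (Amazon's real system was far more complex); it only shows the mechanism by which biased training data produces a biased model:

```python
from collections import Counter

# Hypothetical past hiring decisions (illustrative only, not real data):
# the historical outcomes happen to favour resumes without the word "women's".
past_resumes = [
    ("captain of chess club", "hired"),
    ("led engineering team", "hired"),
    ("women's chess club captain", "rejected"),
    ("women's coding society lead", "rejected"),
]

# "Training": learn a weight for every word from the biased outcomes.
word_scores = Counter()
for text, outcome in past_resumes:
    for word in text.split():
        word_scores[word] += 1 if outcome == "hired" else -1

def score(resume):
    # A word the model never saw contributes 0 (Counter returns 0 for absent keys).
    return sum(word_scores[word] for word in resume.split())

# The model has "taught itself" that "women's" predicts rejection,
# so the very same resume scores lower once that word appears.
print(score("chess club captain"))          # 0
print(score("women's chess club captain"))  # -2
```

Nothing in the code mentions gender explicitly; the penalty on "women's" emerges purely from the biased outcomes in the training data, which is exactly what happened in the Amazon case.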
Read more about AI bias at AI Bias Class 10.