Launched in January 2018, the group was formed by leaders from academia and industry spanning artificial intelligence, education, research, finance, and community and social-impact nonprofits, who banded together around a single focus: “Creating Opportunity for LatinX in AI.”
Artificial intelligence has the potential to displace workers from marginalized populations, including those of Latinx origin. AI already perpetuates social bias and prejudice, in part because Latinx professionals are underrepresented in the AI industry. Machine learning algorithms can encode discriminatory bias when trained on real-world data in which underrepresented groups are not properly characterized or represented. A question quickly emerges: how can we ensure that machine learning does not discriminate against people from minority groups because of the color of their skin, their gender, their ethnicity, or historically unbalanced power structures in society?
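To make this mechanism concrete, the toy sketch below (our own illustration, not part of the session materials; it assumes synthetic data, scikit-learn, and hypothetical group parameters) trains a single classifier on data dominated by a majority group and then measures accuracy separately for each group:

```python
# Minimal sketch of how underrepresentation in training data can degrade
# model quality for a minority group, even without the model ever seeing
# a group label. All data here is synthetic and purely illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def sample_group(n, shift):
    """Draw n examples whose feature/label relationship depends on `shift`,
    mimicking group-specific structure the model would need to learn."""
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)  # group-dependent boundary
    return X, y

# The majority group dominates the training set; the minority group is
# present but severely underrepresented.
X_maj, y_maj = sample_group(n=2000, shift=0.0)
X_min, y_min = sample_group(n=50, shift=2.0)
X_train = np.vstack([X_maj, X_min])
y_train = np.concatenate([y_maj, y_min])

model = LogisticRegression().fit(X_train, y_train)

# Evaluate on balanced held-out samples from each group separately.
for name, shift in [("majority", 0.0), ("minority", 2.0)]:
    X_test, y_test = sample_group(n=1000, shift=shift)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name} accuracy: {acc:.2f}")
```

Because the minority group contributes only a sliver of the training data, the learned decision boundary tracks the majority group's structure, and accuracy on the minority group falls toward chance. This is exactly the kind of encoded bias described above: no malicious intent is required, only unrepresentative data.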
Moreover, because the tech industry does not reflect the population at large, groups underrepresented in computing, such as Hispanics, women, African Americans, and Native Americans, have limited control over the direction of machine learning breakthroughs. As an ethnicity, the Latinx population is an interesting case study for this research, as its members span all skin tones and have a wide regional distribution across the world.
In this session, we argue that it is our responsibility to advance machine learning by increasing the presence of members of our minority group who can build the solutions and algorithms that move the field toward a future in which AI is used to solve problems in our communities while bias and unfairness are directly addressed. As the number of Hispanic- and Latinx-identifying AI practitioners grows, it is also imperative that we have access to share our work at international AI and machine learning conferences, which yield opportunities for collaboration, funding, and job prospects that we would not otherwise have.