At the very start of 2019, the first war of words in the field of artificial intelligence had already broken out on Twitter. This time it was triggered by a mistake on the media website VentureBeat.
In this dispute, Yann LeCun, Andrew Ng, and even Nobel laureate James D. Watson were all name-checked, and the conversation ultimately turned to gender bias in the AI workplace.
The central figure this time was Anima Anandkumar. Although she is not as famous as the field's first-tier leaders, she is an important figure in machine learning.
Anima is currently director of machine learning research at NVIDIA and a tenured professor in the Department of Computing and Mathematical Sciences at the California Institute of Technology.
Before NVIDIA, Anima served as chief scientist at AWS and previously worked at MIT and Microsoft Research.
Her research covers large-scale learning, deep learning, probabilistic models, non-convex optimization, and more, and she is also a reviewer for NeurIPS. Anima put considerable effort into the campaign to rename the conference NeurIPS.
Just today, Anima posted and replied to nearly 50 tweets, joining other users in a furious protest against VentureBeat for running an improperly handled photo in a news article.
Beyond the photo itself, Anima also spoke angrily about the gender discrimination that has long existed in AI research and the treatment she herself has suffered.
The source of the quarrel: headless female AI scientists
On January 2, the well-known US technology outlet VentureBeat published an article on its website in which four AI leaders predicted the development of artificial intelligence in 2019, citing the views of Yann LeCun, Andrew Ng, Hilary Mason, and Rumman Chowdhury.
The title and content of the article posed no great problem, but the picture was badly flawed.
In the image accompanying the article on VentureBeat's website, the heads of the two female scientists were simply cut off.
The two women, shown from the neck down with their heads missing, were Hilary Mason, head of machine learning at the cloud-services giant Cloudera, and Rumman Chowdhury, a senior executive responsible for AI ethics at Accenture.
Netizens were the first to spot the problem and posted about it:
Many felt that, whatever the reason, such an accompanying image was inappropriate for a media website. Yet as of this writing, the article remains unchanged on VentureBeat (https://venturebeat.com/).
The female scientists fire back
Discussion of the matter has been going on since yesterday afternoon, China time. Anima spoke out about the article on Twitter, arguing that VentureBeat's behavior was highly inappropriate and should be corrected as soon as possible.
One stone stirred up a thousand waves: another AI heavyweight jumped in to say that Twitter should take the blame, since the photos had merely been badly cropped.
The man in question was Thomas G. Dietterich, whose Twitter followers include Andrew Ng, Fei-Fei Li, and LeCun. Thomas was also an active participant in the earlier war of words over "AI ethics" sparked by LeCun.
Anima stood her ground, arguing that VentureBeat's mistake also reflected how women have long been overlooked in the tech world. She cited several examples from her own career, such as:
In an interview with Nature, she discussed the #protestNIPS campaign (which led to NIPS being renamed NeurIPS) and @InclusionInML (an organization advocating greater inclusiveness and less prejudice in machine learning). Yet Nature downplayed the contributions of the female scientists behind this work while devoting a great deal of ink to Sophia, the internet-famous robot.
She believes that in chasing attention-grabbing hot topics, the media also frames news content through the lens of a male-dominated society, which is one reason women in tech receive so little attention.
NeurIPS's survival instinct: adding female chairs
Compared with VentureBeat's minor editorial blunder, NeurIPS showed a strong survival instinct last year.
At the recently concluded conference, NeurIPS not only changed the name it had used for many years, but also appointed two women among the four program chairs who serve alongside the general chair.
The program chairs hold the posts directly alongside the general chair: one senior chair and three co-chairs. Hanna Wallach (38) of Microsoft Research held the senior role this year.
Beyond her many achievements in AI as a researcher, Hanna is also one of the founders of the Women in Machine Learning (WiML) workshop.
That workshop originated more than a decade ago: when Hanna attended NIPS, she found only four female engineers present, and the idea of a women's machine-learning gathering was born.
Program leaders like Hanna Wallach, with both visibility and the right background, are the best spokespeople for NeurIPS as it tries to clean up its image as quickly as possible.
Another important program co-chair is Kristen Grauman (39) of Facebook AI. She won the 2011 Marr Prize and did research at MIT's famous CSAIL (Computer Science and Artificial Intelligence Laboratory), at one point publishing as many as 7 CVPR and 4 ICCV papers in a single year.
Her main research directions are computer vision and machine learning, and to be more precise, visual search and object recognition.
Her most influential research achievement, the Pyramid Match Kernel, is an algorithm for image matching. Her 2005 co-authored paper "The Pyramid Match Kernel: Discriminative Classification with Sets of Image Features" is one of the classics in this field.
Few male researchers can match her output in both quantity and quality.
To eliminate AI's prejudice, society's prejudice must be eliminated too
In the engineering industry, the gender ratio is already heavily skewed, and it is rare for female engineers or scientists to stand out.
Some time ago, an AI tool Amazon used to screen résumés was found to show apparent gender discrimination and was promptly taken offline. Although the company later said this was a training error, it objectively shows that women remain a minority, and a neglected group, in this field.
AI programs learn from data, and the data and models they learn from come from real society. If a male perspective dominates the field, even unconscious transmission will teach AI the prejudices of the crowd.
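The mechanism is easy to see with a minimal sketch. The data below is entirely hypothetical (it is not Amazon's system or data): a naive model that simply estimates hiring rates from historically skewed records will reproduce that skew in its scores, having learned nothing about qualifications at all.

```python
from collections import Counter

# Hypothetical, deliberately skewed historical hiring records:
# (gender, hired?) pairs in which men were hired far more often.
history = ([("male", True)] * 80 + [("male", False)] * 20
           + [("female", True)] * 30 + [("female", False)] * 70)

def train(data):
    """Estimate P(hired | gender) by simple counting -- the crudest model."""
    counts = Counter(data)
    rates = {}
    for gender in {g for g, _ in data}:
        total = counts[(gender, True)] + counts[(gender, False)]
        rates[gender] = counts[(gender, True)] / total
    return rates

model = train(history)
# The "model" scores men far above women, purely because the
# historical data did: bias in, bias out.
print(sorted(model.items()))
```

The point of the sketch is that no explicit rule about gender was written anywhere; the disparity is carried entirely by the training data.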
Still, as society develops, more and more female scientists are emerging in the AI field, and growing public attention is giving women a more important place in it.